Nov 29 00:35:53 np0005539505 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 29 00:35:53 np0005539505 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 29 00:35:53 np0005539505 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:53 np0005539505 kernel: BIOS-provided physical RAM map:
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 29 00:35:53 np0005539505 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 29 00:35:53 np0005539505 kernel: NX (Execute Disable) protection: active
Nov 29 00:35:53 np0005539505 kernel: APIC: Static calls initialized
Nov 29 00:35:53 np0005539505 kernel: SMBIOS 2.8 present.
Nov 29 00:35:53 np0005539505 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 29 00:35:53 np0005539505 kernel: Hypervisor detected: KVM
Nov 29 00:35:53 np0005539505 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 29 00:35:53 np0005539505 kernel: kvm-clock: using sched offset of 3413761313 cycles
Nov 29 00:35:53 np0005539505 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 29 00:35:53 np0005539505 kernel: tsc: Detected 2800.000 MHz processor
Nov 29 00:35:53 np0005539505 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 29 00:35:53 np0005539505 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 29 00:35:53 np0005539505 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 29 00:35:53 np0005539505 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 29 00:35:53 np0005539505 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 29 00:35:53 np0005539505 kernel: Using GB pages for direct mapping
Nov 29 00:35:53 np0005539505 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 29 00:35:53 np0005539505 kernel: ACPI: Early table checksum verification disabled
Nov 29 00:35:53 np0005539505 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 29 00:35:53 np0005539505 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:53 np0005539505 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:53 np0005539505 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:53 np0005539505 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 29 00:35:53 np0005539505 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:53 np0005539505 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 29 00:35:53 np0005539505 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 29 00:35:53 np0005539505 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 29 00:35:53 np0005539505 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 29 00:35:53 np0005539505 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 29 00:35:53 np0005539505 kernel: No NUMA configuration found
Nov 29 00:35:53 np0005539505 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 29 00:35:53 np0005539505 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 29 00:35:53 np0005539505 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 29 00:35:53 np0005539505 kernel: Zone ranges:
Nov 29 00:35:53 np0005539505 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 29 00:35:53 np0005539505 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 29 00:35:53 np0005539505 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:53 np0005539505 kernel:  Device   empty
Nov 29 00:35:53 np0005539505 kernel: Movable zone start for each node
Nov 29 00:35:53 np0005539505 kernel: Early memory node ranges
Nov 29 00:35:53 np0005539505 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 29 00:35:53 np0005539505 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 29 00:35:53 np0005539505 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 29 00:35:53 np0005539505 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 29 00:35:53 np0005539505 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 29 00:35:53 np0005539505 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 29 00:35:53 np0005539505 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 29 00:35:53 np0005539505 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 29 00:35:53 np0005539505 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 29 00:35:53 np0005539505 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 29 00:35:53 np0005539505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 29 00:35:53 np0005539505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 29 00:35:53 np0005539505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 29 00:35:53 np0005539505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 29 00:35:53 np0005539505 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 29 00:35:53 np0005539505 kernel: TSC deadline timer available
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Max. logical packages:   8
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Max. logical dies:       8
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Max. dies per package:   1
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Max. threads per core:   1
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Num. cores per package:     1
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Num. threads per package:   1
Nov 29 00:35:53 np0005539505 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 29 00:35:53 np0005539505 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 29 00:35:53 np0005539505 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 29 00:35:53 np0005539505 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 29 00:35:53 np0005539505 kernel: Booting paravirtualized kernel on KVM
Nov 29 00:35:53 np0005539505 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 29 00:35:53 np0005539505 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 29 00:35:53 np0005539505 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 29 00:35:53 np0005539505 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 29 00:35:53 np0005539505 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:53 np0005539505 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 29 00:35:53 np0005539505 kernel: random: crng init done
Nov 29 00:35:53 np0005539505 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: Fallback order for Node 0: 0 
Nov 29 00:35:53 np0005539505 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 29 00:35:53 np0005539505 kernel: Policy zone: Normal
Nov 29 00:35:53 np0005539505 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 29 00:35:53 np0005539505 kernel: software IO TLB: area num 8.
Nov 29 00:35:53 np0005539505 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 29 00:35:53 np0005539505 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 29 00:35:53 np0005539505 kernel: ftrace: allocated 193 pages with 3 groups
Nov 29 00:35:53 np0005539505 kernel: Dynamic Preempt: voluntary
Nov 29 00:35:53 np0005539505 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 29 00:35:53 np0005539505 kernel: rcu: 	RCU event tracing is enabled.
Nov 29 00:35:53 np0005539505 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 29 00:35:53 np0005539505 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 29 00:35:53 np0005539505 kernel: 	Rude variant of Tasks RCU enabled.
Nov 29 00:35:53 np0005539505 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 29 00:35:53 np0005539505 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 29 00:35:53 np0005539505 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 29 00:35:53 np0005539505 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:53 np0005539505 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:53 np0005539505 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 29 00:35:53 np0005539505 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 29 00:35:53 np0005539505 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 29 00:35:53 np0005539505 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 29 00:35:53 np0005539505 kernel: Console: colour VGA+ 80x25
Nov 29 00:35:53 np0005539505 kernel: printk: console [ttyS0] enabled
Nov 29 00:35:53 np0005539505 kernel: ACPI: Core revision 20230331
Nov 29 00:35:53 np0005539505 kernel: APIC: Switch to symmetric I/O mode setup
Nov 29 00:35:53 np0005539505 kernel: x2apic enabled
Nov 29 00:35:53 np0005539505 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 29 00:35:53 np0005539505 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 29 00:35:53 np0005539505 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 29 00:35:53 np0005539505 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 29 00:35:53 np0005539505 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 29 00:35:53 np0005539505 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 29 00:35:53 np0005539505 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 29 00:35:53 np0005539505 kernel: Spectre V2 : Mitigation: Retpolines
Nov 29 00:35:53 np0005539505 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 29 00:35:53 np0005539505 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 29 00:35:53 np0005539505 kernel: RETBleed: Mitigation: untrained return thunk
Nov 29 00:35:53 np0005539505 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 29 00:35:53 np0005539505 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 29 00:35:53 np0005539505 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 29 00:35:53 np0005539505 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 29 00:35:53 np0005539505 kernel: x86/bugs: return thunk changed
Nov 29 00:35:53 np0005539505 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 29 00:35:53 np0005539505 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 29 00:35:53 np0005539505 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 29 00:35:53 np0005539505 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 29 00:35:53 np0005539505 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 29 00:35:53 np0005539505 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 29 00:35:53 np0005539505 kernel: Freeing SMP alternatives memory: 40K
Nov 29 00:35:53 np0005539505 kernel: pid_max: default: 32768 minimum: 301
Nov 29 00:35:53 np0005539505 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 29 00:35:53 np0005539505 kernel: landlock: Up and running.
Nov 29 00:35:53 np0005539505 kernel: Yama: becoming mindful.
Nov 29 00:35:53 np0005539505 kernel: SELinux:  Initializing.
Nov 29 00:35:53 np0005539505 kernel: LSM support for eBPF active
Nov 29 00:35:53 np0005539505 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 29 00:35:53 np0005539505 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 29 00:35:53 np0005539505 kernel: ... version:                0
Nov 29 00:35:53 np0005539505 kernel: ... bit width:              48
Nov 29 00:35:53 np0005539505 kernel: ... generic registers:      6
Nov 29 00:35:53 np0005539505 kernel: ... value mask:             0000ffffffffffff
Nov 29 00:35:53 np0005539505 kernel: ... max period:             00007fffffffffff
Nov 29 00:35:53 np0005539505 kernel: ... fixed-purpose events:   0
Nov 29 00:35:53 np0005539505 kernel: ... event mask:             000000000000003f
Nov 29 00:35:53 np0005539505 kernel: signal: max sigframe size: 1776
Nov 29 00:35:53 np0005539505 kernel: rcu: Hierarchical SRCU implementation.
Nov 29 00:35:53 np0005539505 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 29 00:35:53 np0005539505 kernel: smp: Bringing up secondary CPUs ...
Nov 29 00:35:53 np0005539505 kernel: smpboot: x86: Booting SMP configuration:
Nov 29 00:35:53 np0005539505 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 29 00:35:53 np0005539505 kernel: smp: Brought up 1 node, 8 CPUs
Nov 29 00:35:53 np0005539505 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 29 00:35:53 np0005539505 kernel: node 0 deferred pages initialised in 8ms
Nov 29 00:35:53 np0005539505 kernel: Memory: 7765836K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616280K reserved, 0K cma-reserved)
Nov 29 00:35:53 np0005539505 kernel: devtmpfs: initialized
Nov 29 00:35:53 np0005539505 kernel: x86/mm: Memory block size: 128MB
Nov 29 00:35:53 np0005539505 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 29 00:35:53 np0005539505 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: pinctrl core: initialized pinctrl subsystem
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 29 00:35:53 np0005539505 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 29 00:35:53 np0005539505 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 29 00:35:53 np0005539505 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 29 00:35:53 np0005539505 kernel: audit: initializing netlink subsys (disabled)
Nov 29 00:35:53 np0005539505 kernel: audit: type=2000 audit(1764394551.746:1): state=initialized audit_enabled=0 res=1
Nov 29 00:35:53 np0005539505 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 29 00:35:53 np0005539505 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 29 00:35:53 np0005539505 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 29 00:35:53 np0005539505 kernel: cpuidle: using governor menu
Nov 29 00:35:53 np0005539505 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 29 00:35:53 np0005539505 kernel: PCI: Using configuration type 1 for base access
Nov 29 00:35:53 np0005539505 kernel: PCI: Using configuration type 1 for extended access
Nov 29 00:35:53 np0005539505 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 29 00:35:53 np0005539505 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 29 00:35:53 np0005539505 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 29 00:35:53 np0005539505 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 29 00:35:53 np0005539505 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 29 00:35:53 np0005539505 kernel: Demotion targets for Node 0: null
Nov 29 00:35:53 np0005539505 kernel: cryptd: max_cpu_qlen set to 1000
Nov 29 00:35:53 np0005539505 kernel: ACPI: Added _OSI(Module Device)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Added _OSI(Processor Device)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 29 00:35:53 np0005539505 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 29 00:35:53 np0005539505 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 29 00:35:53 np0005539505 kernel: ACPI: Interpreter enabled
Nov 29 00:35:53 np0005539505 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 29 00:35:53 np0005539505 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 29 00:35:53 np0005539505 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 29 00:35:53 np0005539505 kernel: PCI: Using E820 reservations for host bridge windows
Nov 29 00:35:53 np0005539505 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 29 00:35:53 np0005539505 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [3] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [4] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [5] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [6] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [7] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [8] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [9] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [10] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [11] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [12] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [13] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [14] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [15] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [16] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [17] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [18] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [19] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [20] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [21] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [22] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [23] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [24] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [25] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [26] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [27] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [28] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [29] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [30] registered
Nov 29 00:35:53 np0005539505 kernel: acpiphp: Slot [31] registered
Nov 29 00:35:53 np0005539505 kernel: PCI host bridge to bus 0000:00
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 29 00:35:53 np0005539505 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 29 00:35:53 np0005539505 kernel: iommu: Default domain type: Translated
Nov 29 00:35:53 np0005539505 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 29 00:35:53 np0005539505 kernel: SCSI subsystem initialized
Nov 29 00:35:53 np0005539505 kernel: ACPI: bus type USB registered
Nov 29 00:35:53 np0005539505 kernel: usbcore: registered new interface driver usbfs
Nov 29 00:35:53 np0005539505 kernel: usbcore: registered new interface driver hub
Nov 29 00:35:53 np0005539505 kernel: usbcore: registered new device driver usb
Nov 29 00:35:53 np0005539505 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 29 00:35:53 np0005539505 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 29 00:35:53 np0005539505 kernel: PTP clock support registered
Nov 29 00:35:53 np0005539505 kernel: EDAC MC: Ver: 3.0.0
Nov 29 00:35:53 np0005539505 kernel: NetLabel: Initializing
Nov 29 00:35:53 np0005539505 kernel: NetLabel:  domain hash size = 128
Nov 29 00:35:53 np0005539505 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 29 00:35:53 np0005539505 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 29 00:35:53 np0005539505 kernel: PCI: Using ACPI for IRQ routing
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 29 00:35:53 np0005539505 kernel: vgaarb: loaded
Nov 29 00:35:53 np0005539505 kernel: clocksource: Switched to clocksource kvm-clock
Nov 29 00:35:53 np0005539505 kernel: VFS: Disk quotas dquot_6.6.0
Nov 29 00:35:53 np0005539505 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 29 00:35:53 np0005539505 kernel: pnp: PnP ACPI init
Nov 29 00:35:53 np0005539505 kernel: pnp: PnP ACPI: found 5 devices
Nov 29 00:35:53 np0005539505 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_INET protocol family
Nov 29 00:35:53 np0005539505 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 29 00:35:53 np0005539505 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_XDP protocol family
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 29 00:35:53 np0005539505 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 29 00:35:53 np0005539505 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 29 00:35:53 np0005539505 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 84603 usecs
Nov 29 00:35:53 np0005539505 kernel: PCI: CLS 0 bytes, default 64
Nov 29 00:35:53 np0005539505 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 29 00:35:53 np0005539505 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 29 00:35:53 np0005539505 kernel: ACPI: bus type thunderbolt registered
Nov 29 00:35:53 np0005539505 kernel: Trying to unpack rootfs image as initramfs...
Nov 29 00:35:53 np0005539505 kernel: Initialise system trusted keyrings
Nov 29 00:35:53 np0005539505 kernel: Key type blacklist registered
Nov 29 00:35:53 np0005539505 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 29 00:35:53 np0005539505 kernel: zbud: loaded
Nov 29 00:35:53 np0005539505 kernel: integrity: Platform Keyring initialized
Nov 29 00:35:53 np0005539505 kernel: integrity: Machine keyring initialized
Nov 29 00:35:53 np0005539505 kernel: Freeing initrd memory: 85868K
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_ALG protocol family
Nov 29 00:35:53 np0005539505 kernel: xor: automatically using best checksumming function   avx       
Nov 29 00:35:53 np0005539505 kernel: Key type asymmetric registered
Nov 29 00:35:53 np0005539505 kernel: Asymmetric key parser 'x509' registered
Nov 29 00:35:53 np0005539505 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 29 00:35:53 np0005539505 kernel: io scheduler mq-deadline registered
Nov 29 00:35:53 np0005539505 kernel: io scheduler kyber registered
Nov 29 00:35:53 np0005539505 kernel: io scheduler bfq registered
Nov 29 00:35:53 np0005539505 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 29 00:35:53 np0005539505 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 29 00:35:53 np0005539505 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 29 00:35:53 np0005539505 kernel: ACPI: button: Power Button [PWRF]
Nov 29 00:35:53 np0005539505 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 29 00:35:53 np0005539505 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 29 00:35:53 np0005539505 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 29 00:35:53 np0005539505 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 29 00:35:53 np0005539505 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 29 00:35:53 np0005539505 kernel: Non-volatile memory driver v1.3
Nov 29 00:35:53 np0005539505 kernel: rdac: device handler registered
Nov 29 00:35:53 np0005539505 kernel: hp_sw: device handler registered
Nov 29 00:35:53 np0005539505 kernel: emc: device handler registered
Nov 29 00:35:53 np0005539505 kernel: alua: device handler registered
Nov 29 00:35:53 np0005539505 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 29 00:35:53 np0005539505 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 29 00:35:53 np0005539505 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 29 00:35:53 np0005539505 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 29 00:35:53 np0005539505 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 29 00:35:53 np0005539505 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 29 00:35:53 np0005539505 kernel: usb usb1: Product: UHCI Host Controller
Nov 29 00:35:53 np0005539505 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 29 00:35:53 np0005539505 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 29 00:35:53 np0005539505 kernel: hub 1-0:1.0: USB hub found
Nov 29 00:35:53 np0005539505 kernel: hub 1-0:1.0: 2 ports detected
Nov 29 00:35:53 np0005539505 kernel: usbcore: registered new interface driver usbserial_generic
Nov 29 00:35:53 np0005539505 kernel: usbserial: USB Serial support registered for generic
Nov 29 00:35:53 np0005539505 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 29 00:35:53 np0005539505 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 29 00:35:53 np0005539505 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 29 00:35:53 np0005539505 kernel: mousedev: PS/2 mouse device common for all mice
Nov 29 00:35:53 np0005539505 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 29 00:35:53 np0005539505 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 29 00:35:53 np0005539505 kernel: rtc_cmos 00:04: registered as rtc0
Nov 29 00:35:53 np0005539505 kernel: rtc_cmos 00:04: setting system clock to 2025-11-29T05:35:52 UTC (1764394552)
Nov 29 00:35:53 np0005539505 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 29 00:35:53 np0005539505 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 29 00:35:53 np0005539505 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 29 00:35:53 np0005539505 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 29 00:35:53 np0005539505 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 29 00:35:53 np0005539505 kernel: usbcore: registered new interface driver usbhid
Nov 29 00:35:53 np0005539505 kernel: usbhid: USB HID core driver
Nov 29 00:35:53 np0005539505 kernel: drop_monitor: Initializing network drop monitor service
Nov 29 00:35:53 np0005539505 kernel: Initializing XFRM netlink socket
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_INET6 protocol family
Nov 29 00:35:53 np0005539505 kernel: Segment Routing with IPv6
Nov 29 00:35:53 np0005539505 kernel: NET: Registered PF_PACKET protocol family
Nov 29 00:35:53 np0005539505 kernel: mpls_gso: MPLS GSO support
Nov 29 00:35:53 np0005539505 kernel: IPI shorthand broadcast: enabled
Nov 29 00:35:53 np0005539505 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 29 00:35:53 np0005539505 kernel: AES CTR mode by8 optimization enabled
Nov 29 00:35:53 np0005539505 kernel: sched_clock: Marking stable (1340006410, 146405920)->(1563808789, -77396459)
Nov 29 00:35:53 np0005539505 kernel: registered taskstats version 1
Nov 29 00:35:53 np0005539505 kernel: Loading compiled-in X.509 certificates
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 29 00:35:53 np0005539505 kernel: Demotion targets for Node 0: null
Nov 29 00:35:53 np0005539505 kernel: page_owner is disabled
Nov 29 00:35:53 np0005539505 kernel: Key type .fscrypt registered
Nov 29 00:35:53 np0005539505 kernel: Key type fscrypt-provisioning registered
Nov 29 00:35:53 np0005539505 kernel: Key type big_key registered
Nov 29 00:35:53 np0005539505 kernel: Key type encrypted registered
Nov 29 00:35:53 np0005539505 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 29 00:35:53 np0005539505 kernel: Loading compiled-in module X.509 certificates
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 29 00:35:53 np0005539505 kernel: ima: Allocated hash algorithm: sha256
Nov 29 00:35:53 np0005539505 kernel: ima: No architecture policies found
Nov 29 00:35:53 np0005539505 kernel: evm: Initialising EVM extended attributes:
Nov 29 00:35:53 np0005539505 kernel: evm: security.selinux
Nov 29 00:35:53 np0005539505 kernel: evm: security.SMACK64 (disabled)
Nov 29 00:35:53 np0005539505 kernel: evm: security.SMACK64EXEC (disabled)
Nov 29 00:35:53 np0005539505 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 29 00:35:53 np0005539505 kernel: evm: security.SMACK64MMAP (disabled)
Nov 29 00:35:53 np0005539505 kernel: evm: security.apparmor (disabled)
Nov 29 00:35:53 np0005539505 kernel: evm: security.ima
Nov 29 00:35:53 np0005539505 kernel: evm: security.capability
Nov 29 00:35:53 np0005539505 kernel: evm: HMAC attrs: 0x1
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 29 00:35:53 np0005539505 kernel: Running certificate verification RSA selftest
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 29 00:35:53 np0005539505 kernel: Running certificate verification ECDSA selftest
Nov 29 00:35:53 np0005539505 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 29 00:35:53 np0005539505 kernel: clk: Disabling unused clocks
Nov 29 00:35:53 np0005539505 kernel: Freeing unused decrypted memory: 2028K
Nov 29 00:35:53 np0005539505 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 29 00:35:53 np0005539505 kernel: Write protecting the kernel read-only data: 30720k
Nov 29 00:35:53 np0005539505 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 29 00:35:53 np0005539505 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 29 00:35:53 np0005539505 kernel: Run /init as init process
Nov 29 00:35:53 np0005539505 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:53 np0005539505 systemd: Detected virtualization kvm.
Nov 29 00:35:53 np0005539505 systemd: Detected architecture x86-64.
Nov 29 00:35:53 np0005539505 systemd: Running in initrd.
Nov 29 00:35:53 np0005539505 systemd: No hostname configured, using default hostname.
Nov 29 00:35:53 np0005539505 systemd: Hostname set to <localhost>.
Nov 29 00:35:53 np0005539505 systemd: Initializing machine ID from VM UUID.
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: Manufacturer: QEMU
Nov 29 00:35:53 np0005539505 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 29 00:35:53 np0005539505 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 29 00:35:53 np0005539505 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 29 00:35:53 np0005539505 systemd: Queued start job for default target Initrd Default Target.
Nov 29 00:35:53 np0005539505 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:53 np0005539505 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:53 np0005539505 systemd: Reached target Initrd /usr File System.
Nov 29 00:35:53 np0005539505 systemd: Reached target Local File Systems.
Nov 29 00:35:53 np0005539505 systemd: Reached target Path Units.
Nov 29 00:35:53 np0005539505 systemd: Reached target Slice Units.
Nov 29 00:35:53 np0005539505 systemd: Reached target Swaps.
Nov 29 00:35:53 np0005539505 systemd: Reached target Timer Units.
Nov 29 00:35:53 np0005539505 systemd: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:53 np0005539505 systemd: Listening on Journal Socket (/dev/log).
Nov 29 00:35:53 np0005539505 systemd: Listening on Journal Socket.
Nov 29 00:35:53 np0005539505 systemd: Listening on udev Control Socket.
Nov 29 00:35:53 np0005539505 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:53 np0005539505 systemd: Reached target Socket Units.
Nov 29 00:35:53 np0005539505 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:53 np0005539505 systemd: Starting Journal Service...
Nov 29 00:35:53 np0005539505 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:53 np0005539505 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:53 np0005539505 systemd: Starting Create System Users...
Nov 29 00:35:53 np0005539505 systemd: Starting Setup Virtual Console...
Nov 29 00:35:53 np0005539505 systemd: Finished Create List of Static Device Nodes.
Nov 29 00:35:53 np0005539505 systemd: Finished Apply Kernel Variables.
Nov 29 00:35:53 np0005539505 systemd: Finished Create System Users.
Nov 29 00:35:53 np0005539505 systemd-journald[305]: Journal started
Nov 29 00:35:53 np0005539505 systemd-journald[305]: Runtime Journal (/run/log/journal/e530337733ba4369bb3f4ec9de742582) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:53 np0005539505 systemd-sysusers[309]: Creating group 'users' with GID 100.
Nov 29 00:35:53 np0005539505 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Nov 29 00:35:53 np0005539505 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 29 00:35:53 np0005539505 systemd: Started Journal Service.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:53 np0005539505 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:53 np0005539505 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:53 np0005539505 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:53 np0005539505 systemd[1]: Finished Setup Virtual Console.
Nov 29 00:35:53 np0005539505 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting dracut cmdline hook...
Nov 29 00:35:53 np0005539505 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Nov 29 00:35:53 np0005539505 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 29 00:35:53 np0005539505 systemd[1]: Finished dracut cmdline hook.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting dracut pre-udev hook...
Nov 29 00:35:53 np0005539505 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 29 00:35:53 np0005539505 kernel: device-mapper: uevent: version 1.0.3
Nov 29 00:35:53 np0005539505 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 29 00:35:53 np0005539505 kernel: RPC: Registered named UNIX socket transport module.
Nov 29 00:35:53 np0005539505 kernel: RPC: Registered udp transport module.
Nov 29 00:35:53 np0005539505 kernel: RPC: Registered tcp transport module.
Nov 29 00:35:53 np0005539505 kernel: RPC: Registered tcp-with-tls transport module.
Nov 29 00:35:53 np0005539505 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 29 00:35:53 np0005539505 rpc.statd[441]: Version 2.5.4 starting
Nov 29 00:35:53 np0005539505 rpc.statd[441]: Initializing NSM state
Nov 29 00:35:53 np0005539505 rpc.idmapd[446]: Setting log level to 0
Nov 29 00:35:53 np0005539505 systemd[1]: Finished dracut pre-udev hook.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:53 np0005539505 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:53 np0005539505 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting dracut pre-trigger hook...
Nov 29 00:35:53 np0005539505 systemd[1]: Finished dracut pre-trigger hook.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting Coldplug All udev Devices...
Nov 29 00:35:53 np0005539505 systemd[1]: Created slice Slice /system/modprobe.
Nov 29 00:35:53 np0005539505 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:53 np0005539505 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:53 np0005539505 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:53 np0005539505 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:53 np0005539505 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:53 np0005539505 systemd[1]: Reached target Network.
Nov 29 00:35:53 np0005539505 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 29 00:35:53 np0005539505 systemd[1]: Starting dracut initqueue hook...
Nov 29 00:35:53 np0005539505 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 29 00:35:53 np0005539505 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 29 00:35:53 np0005539505 kernel: vda: vda1
Nov 29 00:35:54 np0005539505 kernel: scsi host0: ata_piix
Nov 29 00:35:54 np0005539505 kernel: scsi host1: ata_piix
Nov 29 00:35:54 np0005539505 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 29 00:35:54 np0005539505 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 29 00:35:54 np0005539505 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target Initrd Root Device.
Nov 29 00:35:54 np0005539505 systemd[1]: Mounting Kernel Configuration File System...
Nov 29 00:35:54 np0005539505 systemd[1]: Mounted Kernel Configuration File System.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target System Initialization.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target Basic System.
Nov 29 00:35:54 np0005539505 kernel: ata1: found unknown device (class 0)
Nov 29 00:35:54 np0005539505 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 29 00:35:54 np0005539505 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 29 00:35:54 np0005539505 systemd-udevd[499]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:54 np0005539505 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 29 00:35:54 np0005539505 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 29 00:35:54 np0005539505 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 29 00:35:54 np0005539505 systemd[1]: Finished dracut initqueue hook.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 29 00:35:54 np0005539505 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:54 np0005539505 systemd[1]: Starting dracut pre-mount hook...
Nov 29 00:35:54 np0005539505 systemd[1]: Finished dracut pre-mount hook.
Nov 29 00:35:54 np0005539505 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 29 00:35:54 np0005539505 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Nov 29 00:35:54 np0005539505 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 29 00:35:54 np0005539505 systemd[1]: Mounting /sysroot...
Nov 29 00:35:54 np0005539505 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 29 00:35:54 np0005539505 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 29 00:35:54 np0005539505 kernel: XFS (vda1): Ending clean mount
Nov 29 00:35:55 np0005539505 systemd[1]: Mounted /sysroot.
Nov 29 00:35:55 np0005539505 systemd[1]: Reached target Initrd Root File System.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 29 00:35:55 np0005539505 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 29 00:35:55 np0005539505 systemd[1]: Reached target Initrd File Systems.
Nov 29 00:35:55 np0005539505 systemd[1]: Reached target Initrd Default Target.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting dracut mount hook...
Nov 29 00:35:55 np0005539505 systemd[1]: Finished dracut mount hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 29 00:35:55 np0005539505 rpc.idmapd[446]: exiting on signal 15
Nov 29 00:35:55 np0005539505 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Network.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Timer Units.
Nov 29 00:35:55 np0005539505 systemd[1]: dbus.socket: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Initrd Default Target.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Basic System.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Initrd Root Device.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Initrd /usr File System.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Path Units.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Remote File Systems.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Slice Units.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Socket Units.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target System Initialization.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Local File Systems.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Swaps.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut mount hook.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut pre-mount hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut initqueue hook.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Coldplug All udev Devices.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut pre-trigger hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Setup Virtual Console.
Nov 29 00:35:55 np0005539505 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 29 00:35:55 np0005539505 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Closed udev Control Socket.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Closed udev Kernel Socket.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut pre-udev hook.
Nov 29 00:35:55 np0005539505 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped dracut cmdline hook.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting Cleanup udev Database...
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 29 00:35:55 np0005539505 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 29 00:35:55 np0005539505 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Stopped Create System Users.
Nov 29 00:35:55 np0005539505 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 29 00:35:55 np0005539505 systemd[1]: Finished Cleanup udev Database.
Nov 29 00:35:55 np0005539505 systemd[1]: Reached target Switch Root.
Nov 29 00:35:55 np0005539505 systemd[1]: Starting Switch Root...
Nov 29 00:35:55 np0005539505 systemd[1]: Switching root.
Nov 29 00:35:55 np0005539505 systemd-journald[305]: Journal stopped
Nov 29 00:35:56 np0005539505 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 29 00:35:56 np0005539505 kernel: audit: type=1404 audit(1764394555.873:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:35:56 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:35:56 np0005539505 kernel: audit: type=1403 audit(1764394556.003:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 29 00:35:56 np0005539505 systemd: Successfully loaded SELinux policy in 134.164ms.
Nov 29 00:35:56 np0005539505 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.262ms.
Nov 29 00:35:56 np0005539505 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 29 00:35:56 np0005539505 systemd: Detected virtualization kvm.
Nov 29 00:35:56 np0005539505 systemd: Detected architecture x86-64.
Nov 29 00:35:56 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:35:56 np0005539505 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd: Stopped Switch Root.
Nov 29 00:35:56 np0005539505 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 29 00:35:56 np0005539505 systemd: Created slice Slice /system/getty.
Nov 29 00:35:56 np0005539505 systemd: Created slice Slice /system/serial-getty.
Nov 29 00:35:56 np0005539505 systemd: Created slice Slice /system/sshd-keygen.
Nov 29 00:35:56 np0005539505 systemd: Created slice User and Session Slice.
Nov 29 00:35:56 np0005539505 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 29 00:35:56 np0005539505 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 29 00:35:56 np0005539505 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 29 00:35:56 np0005539505 systemd: Reached target Local Encrypted Volumes.
Nov 29 00:35:56 np0005539505 systemd: Stopped target Switch Root.
Nov 29 00:35:56 np0005539505 systemd: Stopped target Initrd File Systems.
Nov 29 00:35:56 np0005539505 systemd: Stopped target Initrd Root File System.
Nov 29 00:35:56 np0005539505 systemd: Reached target Local Integrity Protected Volumes.
Nov 29 00:35:56 np0005539505 systemd: Reached target Path Units.
Nov 29 00:35:56 np0005539505 systemd: Reached target rpc_pipefs.target.
Nov 29 00:35:56 np0005539505 systemd: Reached target Slice Units.
Nov 29 00:35:56 np0005539505 systemd: Reached target Swaps.
Nov 29 00:35:56 np0005539505 systemd: Reached target Local Verity Protected Volumes.
Nov 29 00:35:56 np0005539505 systemd: Listening on RPCbind Server Activation Socket.
Nov 29 00:35:56 np0005539505 systemd: Reached target RPC Port Mapper.
Nov 29 00:35:56 np0005539505 systemd: Listening on Process Core Dump Socket.
Nov 29 00:35:56 np0005539505 systemd: Listening on initctl Compatibility Named Pipe.
Nov 29 00:35:56 np0005539505 systemd: Listening on udev Control Socket.
Nov 29 00:35:56 np0005539505 systemd: Listening on udev Kernel Socket.
Nov 29 00:35:56 np0005539505 systemd: Mounting Huge Pages File System...
Nov 29 00:35:56 np0005539505 systemd: Mounting POSIX Message Queue File System...
Nov 29 00:35:56 np0005539505 systemd: Mounting Kernel Debug File System...
Nov 29 00:35:56 np0005539505 systemd: Mounting Kernel Trace File System...
Nov 29 00:35:56 np0005539505 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:56 np0005539505 systemd: Starting Create List of Static Device Nodes...
Nov 29 00:35:56 np0005539505 systemd: Starting Load Kernel Module configfs...
Nov 29 00:35:56 np0005539505 systemd: Starting Load Kernel Module drm...
Nov 29 00:35:56 np0005539505 systemd: Starting Load Kernel Module efi_pstore...
Nov 29 00:35:56 np0005539505 systemd: Starting Load Kernel Module fuse...
Nov 29 00:35:56 np0005539505 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 29 00:35:56 np0005539505 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd: Stopped File System Check on Root Device.
Nov 29 00:35:56 np0005539505 systemd: Stopped Journal Service.
Nov 29 00:35:56 np0005539505 kernel: fuse: init (API version 7.37)
Nov 29 00:35:56 np0005539505 systemd: Starting Journal Service...
Nov 29 00:35:56 np0005539505 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 29 00:35:56 np0005539505 systemd: Starting Generate network units from Kernel command line...
Nov 29 00:35:56 np0005539505 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:56 np0005539505 systemd: Starting Remount Root and Kernel File Systems...
Nov 29 00:35:56 np0005539505 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 29 00:35:56 np0005539505 systemd: Starting Apply Kernel Variables...
Nov 29 00:35:56 np0005539505 systemd: Starting Coldplug All udev Devices...
Nov 29 00:35:56 np0005539505 systemd-journald[677]: Journal started
Nov 29 00:35:56 np0005539505 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:56 np0005539505 systemd[1]: Queued start job for default target Multi-User System.
Nov 29 00:35:56 np0005539505 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd: Started Journal Service.
Nov 29 00:35:56 np0005539505 systemd[1]: Mounted Huge Pages File System.
Nov 29 00:35:56 np0005539505 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 29 00:35:56 np0005539505 systemd[1]: Mounted POSIX Message Queue File System.
Nov 29 00:35:56 np0005539505 systemd[1]: Mounted Kernel Debug File System.
Nov 29 00:35:56 np0005539505 systemd[1]: Mounted Kernel Trace File System.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Create List of Static Device Nodes.
Nov 29 00:35:56 np0005539505 kernel: ACPI: bus type drm_connector registered
Nov 29 00:35:56 np0005539505 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:56 np0005539505 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Load Kernel Module drm.
Nov 29 00:35:56 np0005539505 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 29 00:35:56 np0005539505 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Load Kernel Module fuse.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Generate network units from Kernel command line.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Apply Kernel Variables.
Nov 29 00:35:56 np0005539505 systemd[1]: Mounting FUSE Control File System...
Nov 29 00:35:56 np0005539505 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Rebuild Hardware Database...
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 29 00:35:56 np0005539505 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Load/Save OS Random Seed...
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Create System Users...
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Coldplug All udev Devices.
Nov 29 00:35:56 np0005539505 systemd[1]: Mounted FUSE Control File System.
Nov 29 00:35:56 np0005539505 systemd-journald[677]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 29 00:35:56 np0005539505 systemd-journald[677]: Received client request to flush runtime journal.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Load/Save OS Random Seed.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 29 00:35:56 np0005539505 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Create System Users.
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 29 00:35:56 np0005539505 systemd[1]: Reached target Preparation for Local File Systems.
Nov 29 00:35:56 np0005539505 systemd[1]: Reached target Local File Systems.
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 29 00:35:56 np0005539505 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 29 00:35:56 np0005539505 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 29 00:35:56 np0005539505 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Automatic Boot Loader Update...
Nov 29 00:35:56 np0005539505 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Create Volatile Files and Directories...
Nov 29 00:35:56 np0005539505 bootctl[694]: Couldn't find EFI system partition, skipping.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Automatic Boot Loader Update.
Nov 29 00:35:56 np0005539505 systemd[1]: Finished Create Volatile Files and Directories.
Nov 29 00:35:56 np0005539505 systemd[1]: Starting Security Auditing Service...
Nov 29 00:35:57 np0005539505 systemd[1]: Starting RPC Bind...
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Rebuild Journal Catalog...
Nov 29 00:35:57 np0005539505 auditd[700]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 29 00:35:57 np0005539505 auditd[700]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Rebuild Journal Catalog.
Nov 29 00:35:57 np0005539505 augenrules[705]: /sbin/augenrules: No change
Nov 29 00:35:57 np0005539505 augenrules[720]: No rules
Nov 29 00:35:57 np0005539505 augenrules[720]: enabled 1
Nov 29 00:35:57 np0005539505 augenrules[720]: failure 1
Nov 29 00:35:57 np0005539505 augenrules[720]: pid 700
Nov 29 00:35:57 np0005539505 augenrules[720]: rate_limit 0
Nov 29 00:35:57 np0005539505 augenrules[720]: backlog_limit 8192
Nov 29 00:35:57 np0005539505 augenrules[720]: lost 0
Nov 29 00:35:57 np0005539505 augenrules[720]: backlog 0
Nov 29 00:35:57 np0005539505 augenrules[720]: backlog_wait_time 60000
Nov 29 00:35:57 np0005539505 augenrules[720]: backlog_wait_time_actual 0
Nov 29 00:35:57 np0005539505 systemd[1]: Started Security Auditing Service.
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 29 00:35:57 np0005539505 systemd[1]: Started RPC Bind.
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Rebuild Hardware Database.
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Update is Completed...
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Update is Completed.
Nov 29 00:35:57 np0005539505 systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Nov 29 00:35:57 np0005539505 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target System Initialization.
Nov 29 00:35:57 np0005539505 systemd[1]: Started dnf makecache --timer.
Nov 29 00:35:57 np0005539505 systemd[1]: Started Daily rotation of log files.
Nov 29 00:35:57 np0005539505 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target Timer Units.
Nov 29 00:35:57 np0005539505 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 29 00:35:57 np0005539505 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target Socket Units.
Nov 29 00:35:57 np0005539505 systemd[1]: Starting D-Bus System Message Bus...
Nov 29 00:35:57 np0005539505 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:57 np0005539505 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Load Kernel Module configfs...
Nov 29 00:35:57 np0005539505 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Load Kernel Module configfs.
Nov 29 00:35:57 np0005539505 systemd-udevd[739]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:35:57 np0005539505 systemd[1]: Started D-Bus System Message Bus.
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target Basic System.
Nov 29 00:35:57 np0005539505 dbus-broker-lau[765]: Ready
Nov 29 00:35:57 np0005539505 systemd[1]: Starting NTP client/server...
Nov 29 00:35:57 np0005539505 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 29 00:35:57 np0005539505 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 29 00:35:57 np0005539505 systemd[1]: Starting IPv4 firewall with iptables...
Nov 29 00:35:57 np0005539505 systemd[1]: Started irqbalance daemon.
Nov 29 00:35:57 np0005539505 chronyd[788]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 00:35:57 np0005539505 chronyd[788]: Loaded 0 symmetric keys
Nov 29 00:35:57 np0005539505 chronyd[788]: Using right/UTC timezone to obtain leap second data
Nov 29 00:35:57 np0005539505 chronyd[788]: Loaded seccomp filter (level 2)
Nov 29 00:35:57 np0005539505 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 29 00:35:57 np0005539505 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 29 00:35:57 np0005539505 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 29 00:35:57 np0005539505 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 29 00:35:57 np0005539505 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:57 np0005539505 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:57 np0005539505 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target sshd-keygen.target.
Nov 29 00:35:57 np0005539505 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 29 00:35:57 np0005539505 systemd[1]: Reached target User and Group Name Lookups.
Nov 29 00:35:57 np0005539505 systemd[1]: Starting User Login Management...
Nov 29 00:35:57 np0005539505 systemd[1]: Started NTP client/server.
Nov 29 00:35:57 np0005539505 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 29 00:35:57 np0005539505 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 29 00:35:57 np0005539505 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 29 00:35:57 np0005539505 kernel: Console: switching to colour dummy device 80x25
Nov 29 00:35:57 np0005539505 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 29 00:35:57 np0005539505 kernel: [drm] features: -context_init
Nov 29 00:35:57 np0005539505 kernel: [drm] number of scanouts: 1
Nov 29 00:35:57 np0005539505 kernel: [drm] number of cap sets: 0
Nov 29 00:35:57 np0005539505 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 29 00:35:57 np0005539505 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 29 00:35:57 np0005539505 kernel: Console: switching to colour frame buffer device 128x48
Nov 29 00:35:57 np0005539505 systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 00:35:57 np0005539505 systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 00:35:57 np0005539505 kernel: kvm_amd: TSC scaling supported
Nov 29 00:35:57 np0005539505 kernel: kvm_amd: Nested Virtualization enabled
Nov 29 00:35:57 np0005539505 kernel: kvm_amd: Nested Paging enabled
Nov 29 00:35:57 np0005539505 kernel: kvm_amd: LBR virtualization supported
Nov 29 00:35:57 np0005539505 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 29 00:35:57 np0005539505 systemd-logind[794]: New seat seat0.
Nov 29 00:35:57 np0005539505 systemd[1]: Started User Login Management.
Nov 29 00:35:57 np0005539505 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 29 00:35:57 np0005539505 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 29 00:35:58 np0005539505 iptables.init[782]: iptables: Applying firewall rules: [  OK  ]
Nov 29 00:35:58 np0005539505 systemd[1]: Finished IPv4 firewall with iptables.
Nov 29 00:35:58 np0005539505 cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 29 Nov 2025 05:35:58 +0000. Up 6.99 seconds.
Nov 29 00:35:58 np0005539505 systemd[1]: run-cloud\x2dinit-tmp-tmp20zal3m_.mount: Deactivated successfully.
Nov 29 00:35:58 np0005539505 systemd[1]: Starting Hostname Service...
Nov 29 00:35:58 np0005539505 systemd[1]: Started Hostname Service.
Nov 29 00:35:58 np0005539505 systemd-hostnamed[854]: Hostname set to <np0005539505.novalocal> (static)
Nov 29 00:35:58 np0005539505 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 29 00:35:58 np0005539505 systemd[1]: Reached target Preparation for Network.
Nov 29 00:35:58 np0005539505 systemd[1]: Starting Network Manager...
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7492] NetworkManager (version 1.54.1-1.el9) is starting... (boot:c017802d-b3d3-4f2d-87a0-b39da9e20414)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7497] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7560] manager[0x55995cfb8080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7589] hostname: hostname: using hostnamed
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7589] hostname: static hostname changed from (none) to "np0005539505.novalocal"
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7593] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7685] manager[0x55995cfb8080]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7685] manager[0x55995cfb8080]: rfkill: WWAN hardware radio set enabled
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7722] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7722] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7723] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7723] manager: Networking is enabled by state file
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7725] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7738] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7757] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7769] dhcp: init: Using DHCP client 'internal'
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7771] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7783] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:58 np0005539505 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7793] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7802] device (lo): Activation: starting connection 'lo' (6d2d9121-1479-42a1-9107-1290d8f7122c)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7811] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7814] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7842] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7846] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7848] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7849] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7851] device (eth0): carrier: link connected
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7853] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7859] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7867] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7870] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7871] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7874] manager: NetworkManager state is now CONNECTING
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7875] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7881] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7884] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7934] dhcp4 (eth0): state changed new lease, address=38.102.83.200
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7941] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.7960] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:35:58 np0005539505 systemd[1]: Started Network Manager.
Nov 29 00:35:58 np0005539505 systemd[1]: Reached target Network.
Nov 29 00:35:58 np0005539505 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:35:58 np0005539505 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 29 00:35:58 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8185] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8187] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8188] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8193] device (lo): Activation: successful, device activated.
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8199] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8202] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8204] device (eth0): Activation: successful, device activated.
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8210] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:35:58 np0005539505 NetworkManager[858]: <info>  [1764394558.8213] manager: startup complete
Nov 29 00:35:58 np0005539505 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 29 00:35:58 np0005539505 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 29 00:35:58 np0005539505 systemd[1]: Reached target NFS client services.
Nov 29 00:35:58 np0005539505 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 29 00:35:58 np0005539505 systemd[1]: Reached target Remote File Systems.
Nov 29 00:35:58 np0005539505 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 29 00:35:58 np0005539505 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:35:58 np0005539505 systemd[1]: Starting Cloud-init: Network Stage...
Nov 29 00:35:59 np0005539505 cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 29 Nov 2025 05:35:59 +0000. Up 7.95 seconds.
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |  eth0  | True |        38.102.83.200         | 255.255.255.0 | global | fa:16:3e:5a:56:06 |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fe5a:5606/64 |       .       |  link  | fa:16:3e:5a:56:06 |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 29 00:35:59 np0005539505 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 29 00:36:00 np0005539505 cloud-init[921]: Generating public/private rsa key pair.
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key fingerprint is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: SHA256:8uSiOL/8ohLhJ4EeJ1UeGFDPou0QCjK18DQTbBtCEFI root@np0005539505.novalocal
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key's randomart image is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: +---[RSA 3072]----+
Nov 29 00:36:00 np0005539505 cloud-init[921]: |B*E++o           |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |o=+*+ .          |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |=+++ +           |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |*==..            |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |++=.  . S        |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |.+o.   =         |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | .o.  . o        |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |. .o.. .         |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | .o+=+.          |
Nov 29 00:36:00 np0005539505 cloud-init[921]: +----[SHA256]-----+
Nov 29 00:36:00 np0005539505 cloud-init[921]: Generating public/private ecdsa key pair.
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key fingerprint is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: SHA256:3xonhc1pcd1M+ZsMH+HJ+SglIvSjQ1VdEehKiybrZoc root@np0005539505.novalocal
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key's randomart image is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: +---[ECDSA 256]---+
Nov 29 00:36:00 np0005539505 cloud-init[921]: |           ....+*|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |        . .  ..*.|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |       . o  o + O|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |        o +=.*.*.|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |       .So+oOo+.=|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |       .o+ *. .=.|
Nov 29 00:36:00 np0005539505 cloud-init[921]: |        =.+ o.   |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |       E . =     |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |      +.. .      |
Nov 29 00:36:00 np0005539505 cloud-init[921]: +----[SHA256]-----+
Nov 29 00:36:00 np0005539505 cloud-init[921]: Generating public/private ed25519 key pair.
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 29 00:36:00 np0005539505 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key fingerprint is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: SHA256:K2BesUQsbIPs649uvlcwKv+HI0Z0GVq8LYHn8mMZdTY root@np0005539505.novalocal
Nov 29 00:36:00 np0005539505 cloud-init[921]: The key's randomart image is:
Nov 29 00:36:00 np0005539505 cloud-init[921]: +--[ED25519 256]--+
Nov 29 00:36:00 np0005539505 cloud-init[921]: | . = ..          |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |  + @.o E        |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | . * Xoo .       |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |  = X..o         |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | . *o*o S        |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |. +o=o.  .       |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | = ..+. .        |
Nov 29 00:36:00 np0005539505 cloud-init[921]: |  *.+ ..         |
Nov 29 00:36:00 np0005539505 cloud-init[921]: | ==*oo           |
Nov 29 00:36:00 np0005539505 cloud-init[921]: +----[SHA256]-----+
Nov 29 00:36:00 np0005539505 sm-notify[1003]: Version 2.5.4 starting
Nov 29 00:36:00 np0005539505 systemd[1]: Finished Cloud-init: Network Stage.
Nov 29 00:36:00 np0005539505 systemd[1]: Reached target Cloud-config availability.
Nov 29 00:36:00 np0005539505 systemd[1]: Reached target Network is Online.
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Cloud-init: Config Stage...
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Crash recovery kernel arming...
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Notify NFS peers of a restart...
Nov 29 00:36:00 np0005539505 systemd[1]: Starting System Logging Service...
Nov 29 00:36:00 np0005539505 systemd[1]: Starting OpenSSH server daemon...
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Permit User Sessions...
Nov 29 00:36:00 np0005539505 systemd[1]: Started OpenSSH server daemon.
Nov 29 00:36:00 np0005539505 systemd[1]: Started Notify NFS peers of a restart.
Nov 29 00:36:00 np0005539505 systemd[1]: Finished Permit User Sessions.
Nov 29 00:36:00 np0005539505 systemd[1]: Started Command Scheduler.
Nov 29 00:36:00 np0005539505 systemd[1]: Started Getty on tty1.
Nov 29 00:36:00 np0005539505 systemd[1]: Started Serial Getty on ttyS0.
Nov 29 00:36:00 np0005539505 systemd[1]: Reached target Login Prompts.
Nov 29 00:36:00 np0005539505 rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Nov 29 00:36:00 np0005539505 systemd[1]: Started System Logging Service.
Nov 29 00:36:00 np0005539505 rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 29 00:36:00 np0005539505 systemd[1]: Reached target Multi-User System.
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 29 00:36:00 np0005539505 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 29 00:36:00 np0005539505 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 29 00:36:00 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 00:36:00 np0005539505 kdumpctl[1013]: kdump: No kdump initial ramdisk found.
Nov 29 00:36:00 np0005539505 kdumpctl[1013]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 29 00:36:00 np0005539505 cloud-init[1131]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 29 Nov 2025 05:36:00 +0000. Up 9.61 seconds.
Nov 29 00:36:00 np0005539505 systemd[1]: Finished Cloud-init: Config Stage.
Nov 29 00:36:00 np0005539505 systemd[1]: Starting Cloud-init: Final Stage...
Nov 29 00:36:01 np0005539505 dracut[1264]: dracut-057-102.git20250818.el9
Nov 29 00:36:01 np0005539505 cloud-init[1282]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 29 Nov 2025 05:36:01 +0000. Up 10.03 seconds.
Nov 29 00:36:01 np0005539505 dracut[1266]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 29 00:36:01 np0005539505 cloud-init[1303]: #############################################################
Nov 29 00:36:01 np0005539505 cloud-init[1306]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 29 00:36:01 np0005539505 cloud-init[1314]: 256 SHA256:3xonhc1pcd1M+ZsMH+HJ+SglIvSjQ1VdEehKiybrZoc root@np0005539505.novalocal (ECDSA)
Nov 29 00:36:01 np0005539505 cloud-init[1317]: 256 SHA256:K2BesUQsbIPs649uvlcwKv+HI0Z0GVq8LYHn8mMZdTY root@np0005539505.novalocal (ED25519)
Nov 29 00:36:01 np0005539505 cloud-init[1324]: 3072 SHA256:8uSiOL/8ohLhJ4EeJ1UeGFDPou0QCjK18DQTbBtCEFI root@np0005539505.novalocal (RSA)
Nov 29 00:36:01 np0005539505 cloud-init[1327]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 29 00:36:01 np0005539505 cloud-init[1330]: #############################################################
Nov 29 00:36:01 np0005539505 cloud-init[1282]: Cloud-init v. 24.4-7.el9 finished at Sat, 29 Nov 2025 05:36:01 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.19 seconds
Nov 29 00:36:01 np0005539505 systemd[1]: Finished Cloud-init: Final Stage.
Nov 29 00:36:01 np0005539505 systemd[1]: Reached target Cloud-init target.
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:36:01 np0005539505 dracut[1266]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: memstrack is not available
Nov 29 00:36:02 np0005539505 dracut[1266]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 29 00:36:02 np0005539505 dracut[1266]: memstrack is not available
Nov 29 00:36:02 np0005539505 dracut[1266]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 29 00:36:02 np0005539505 dracut[1266]: *** Including module: systemd ***
Nov 29 00:36:02 np0005539505 dracut[1266]: *** Including module: fips ***
Nov 29 00:36:03 np0005539505 dracut[1266]: *** Including module: systemd-initrd ***
Nov 29 00:36:03 np0005539505 dracut[1266]: *** Including module: i18n ***
Nov 29 00:36:03 np0005539505 dracut[1266]: *** Including module: drm ***
Nov 29 00:36:03 np0005539505 dracut[1266]: *** Including module: prefixdevname ***
Nov 29 00:36:03 np0005539505 dracut[1266]: *** Including module: kernel-modules ***
Nov 29 00:36:04 np0005539505 chronyd[788]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Nov 29 00:36:04 np0005539505 chronyd[788]: System clock wrong by 1.155077 seconds
Nov 29 00:36:04 np0005539505 chronyd[788]: System clock was stepped by 1.155077 seconds
Nov 29 00:36:04 np0005539505 chronyd[788]: System clock TAI offset set to 37 seconds
Nov 29 00:36:05 np0005539505 kernel: block vda: the capability attribute has been deprecated.
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: kernel-modules-extra ***
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: qemu ***
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: fstab-sys ***
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: rootfs-block ***
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: terminfo ***
Nov 29 00:36:05 np0005539505 dracut[1266]: *** Including module: udev-rules ***
Nov 29 00:36:06 np0005539505 dracut[1266]: Skipping udev rule: 91-permissions.rules
Nov 29 00:36:06 np0005539505 dracut[1266]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: virtiofs ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: dracut-systemd ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: usrmount ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: base ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: fs-lib ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: kdumpbase ***
Nov 29 00:36:06 np0005539505 dracut[1266]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 29 00:36:06 np0005539505 dracut[1266]:  microcode_ctl module: mangling fw_dir
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel" is ignored
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 29 00:36:06 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 29 00:36:07 np0005539505 dracut[1266]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 29 00:36:07 np0005539505 dracut[1266]: *** Including module: openssl ***
Nov 29 00:36:07 np0005539505 dracut[1266]: *** Including module: shutdown ***
Nov 29 00:36:07 np0005539505 dracut[1266]: *** Including module: squash ***
Nov 29 00:36:07 np0005539505 dracut[1266]: *** Including modules done ***
Nov 29 00:36:07 np0005539505 dracut[1266]: *** Installing kernel module dependencies ***
Nov 29 00:36:08 np0005539505 dracut[1266]: *** Installing kernel module dependencies done ***
Nov 29 00:36:08 np0005539505 dracut[1266]: *** Resolving executable dependencies ***
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 25 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 31 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 28 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 32 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 30 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 29 00:36:09 np0005539505 irqbalance[784]: IRQ 29 affinity is now unmanaged
Nov 29 00:36:09 np0005539505 dracut[1266]: *** Resolving executable dependencies done ***
Nov 29 00:36:09 np0005539505 dracut[1266]: *** Generating early-microcode cpio image ***
Nov 29 00:36:09 np0005539505 dracut[1266]: *** Store current command line parameters ***
Nov 29 00:36:09 np0005539505 dracut[1266]: Stored kernel commandline:
Nov 29 00:36:09 np0005539505 dracut[1266]: No dracut internal kernel commandline stored in the initramfs
Nov 29 00:36:09 np0005539505 dracut[1266]: *** Install squash loader ***
Nov 29 00:36:10 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:36:10 np0005539505 dracut[1266]: *** Squashing the files inside the initramfs ***
Nov 29 00:36:11 np0005539505 dracut[1266]: *** Squashing the files inside the initramfs done ***
Nov 29 00:36:11 np0005539505 dracut[1266]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 29 00:36:11 np0005539505 dracut[1266]: *** Hardlinking files ***
Nov 29 00:36:11 np0005539505 dracut[1266]: *** Hardlinking files done ***
Nov 29 00:36:11 np0005539505 dracut[1266]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 29 00:36:12 np0005539505 kdumpctl[1013]: kdump: kexec: loaded kdump kernel
Nov 29 00:36:12 np0005539505 kdumpctl[1013]: kdump: Starting kdump: [OK]
Nov 29 00:36:12 np0005539505 systemd[1]: Finished Crash recovery kernel arming.
Nov 29 00:36:12 np0005539505 systemd[1]: Startup finished in 1.660s (kernel) + 3.006s (initrd) + 15.362s (userspace) = 20.029s.
Nov 29 00:36:29 np0005539505 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:46:55 np0005539505 systemd[1]: Created slice User Slice of UID 1000.
Nov 29 00:46:55 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 29 00:46:55 np0005539505 systemd-logind[794]: New session 1 of user zuul.
Nov 29 00:46:55 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 29 00:46:55 np0005539505 systemd[1]: Starting User Manager for UID 1000...
Nov 29 00:46:55 np0005539505 systemd[4304]: Queued start job for default target Main User Target.
Nov 29 00:46:55 np0005539505 systemd[4304]: Created slice User Application Slice.
Nov 29 00:46:55 np0005539505 systemd[4304]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 00:46:55 np0005539505 systemd[4304]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 00:46:55 np0005539505 systemd[4304]: Reached target Paths.
Nov 29 00:46:55 np0005539505 systemd[4304]: Reached target Timers.
Nov 29 00:46:55 np0005539505 systemd[4304]: Starting D-Bus User Message Bus Socket...
Nov 29 00:46:55 np0005539505 systemd[4304]: Starting Create User's Volatile Files and Directories...
Nov 29 00:46:55 np0005539505 systemd[4304]: Finished Create User's Volatile Files and Directories.
Nov 29 00:46:55 np0005539505 systemd[4304]: Listening on D-Bus User Message Bus Socket.
Nov 29 00:46:55 np0005539505 systemd[4304]: Reached target Sockets.
Nov 29 00:46:55 np0005539505 systemd[4304]: Reached target Basic System.
Nov 29 00:46:55 np0005539505 systemd[4304]: Reached target Main User Target.
Nov 29 00:46:55 np0005539505 systemd[4304]: Startup finished in 136ms.
Nov 29 00:46:55 np0005539505 systemd[1]: Started User Manager for UID 1000.
Nov 29 00:46:55 np0005539505 systemd[1]: Started Session 1 of User zuul.
Nov 29 00:46:55 np0005539505 python3[4387]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:00 np0005539505 python3[4416]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:07 np0005539505 python3[4475]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:08 np0005539505 python3[4515]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 29 00:47:10 np0005539505 python3[4541]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDgFkjZfiFEmT2Jql9lLFt6CMd+9slSl3MrU+Raer5Y68zzzczsYHXSYgggBZM5uz+gWk02zu4ocSLCc0JOe4EmLwZGL6Ezoic8MmIXP1BwfiaeAXto2OGK7Dc7os16Q0SND6rHgOqdWZh8Kyf2kkY5vrdl9/yfrpAOV4V0UE16RT1qCQW53Ky9IytfIZYMSXaZwSmcvRflB6YToX0wepfVb3xbVWsEBI209yBpJ9cNVY5dWwvu1IlNXbIGLhUr4j3UgrB2k+H2+ltPlEHfLXPB0E2e43vS9K00XtLqpM4JZoq24L0kLi1a3RwzEeG1NQhkGbdnesYTkGRJrh5LvfWLiF4tooJWI0nRVs7jaO/R3w1l7zjdLRrSJ0h7Ie09iYSVZ1nuUuZ77A8mwh/mgdp8FEle4ES1X0kEADcAPPXV/6wFLOHevKRKw+jWBtYusFM6hS74njbD8BM8P0xMUAgCMIw7t3AXjeZIFNjZLL1o2fplfERitOr2Mc7dMx1EvfM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:10 np0005539505 python3[4565]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:11 np0005539505 python3[4664]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:11 np0005539505 python3[4735]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.0468383-253-50365740664541/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa follow=False checksum=527bb20bedeb4c076b14aeb265edb174c4d8c41f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:12 np0005539505 python3[4858]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:12 np0005539505 python3[4929]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395231.9847035-308-242022355482443/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=913550f2748a46ca85451ad1c4228192_id_rsa.pub follow=False checksum=56c975ef54c9fc5ba54f09c3deb1770b074b7446 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:13 np0005539505 python3[4977]: ansible-ping Invoked with data=pong
Nov 29 00:47:15 np0005539505 python3[5001]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 00:47:17 np0005539505 python3[5059]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 29 00:47:18 np0005539505 python3[5091]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:18 np0005539505 python3[5115]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:18 np0005539505 python3[5139]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539505 python3[5163]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539505 python3[5187]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:19 np0005539505 python3[5211]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:21 np0005539505 python3[5237]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:22 np0005539505 python3[5315]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:22 np0005539505 python3[5388]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395241.6624804-34-247068459082114/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:23 np0005539505 python3[5436]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:23 np0005539505 python3[5460]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:23 np0005539505 python3[5484]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539505 python3[5508]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539505 python3[5532]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539505 python3[5556]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:24 np0005539505 python3[5580]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539505 python3[5604]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539505 python3[5628]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:25 np0005539505 python3[5652]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539505 python3[5676]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539505 python3[5700]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539505 python3[5724]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:26 np0005539505 python3[5748]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539505 python3[5772]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539505 python3[5796]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539505 python3[5820]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:27 np0005539505 python3[5844]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539505 python3[5868]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539505 python3[5892]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539505 python3[5916]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:28 np0005539505 python3[5940]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539505 python3[5964]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539505 python3[5988]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539505 python3[6012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:29 np0005539505 python3[6036]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:47:32 np0005539505 python3[6062]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 00:47:32 np0005539505 systemd[1]: Starting Time & Date Service...
Nov 29 00:47:33 np0005539505 systemd[1]: Started Time & Date Service.
Nov 29 00:47:33 np0005539505 systemd-timedated[6064]: Changed time zone to 'UTC' (UTC).
Nov 29 00:47:33 np0005539505 python3[6094]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:34 np0005539505 python3[6170]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:34 np0005539505 python3[6241]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764395253.8554306-254-217285890330742/source _original_basename=tmpzx4ukbvs follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:34 np0005539505 python3[6341]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:35 np0005539505 python3[6412]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395254.7344494-305-80231602095228/source _original_basename=tmpf3abvx52 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:36 np0005539505 python3[6514]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:36 np0005539505 python3[6587]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764395255.9429088-384-232720444987038/source _original_basename=tmplc_ieppf follow=False checksum=2301e7467fee6838fa1b769310e346cfdbf4fa73 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:37 np0005539505 python3[6635]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:37 np0005539505 python3[6661]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:37 np0005539505 python3[6741]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:47:38 np0005539505 python3[6814]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395257.6268554-454-213928001646932/source _original_basename=tmplk8hcnl6 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:47:38 np0005539505 python3[6865]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-d89d-f1f2-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:47:39 np0005539505 python3[6893]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d89d-f1f2-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 29 00:47:41 np0005539505 python3[6922]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:00 np0005539505 systemd[1]: Starting dnf makecache...
Nov 29 00:48:00 np0005539505 python3[6950]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:48:00 np0005539505 dnf[6949]: Failed determining last makecache time.
Nov 29 00:48:01 np0005539505 dnf[6949]: CentOS Stream 9 - BaseOS                         66 kB/s | 7.3 kB     00:00
Nov 29 00:48:01 np0005539505 dnf[6949]: CentOS Stream 9 - AppStream                      78 kB/s | 7.4 kB     00:00
Nov 29 00:48:01 np0005539505 dnf[6949]: CentOS Stream 9 - CRB                            73 kB/s | 7.2 kB     00:00
Nov 29 00:48:01 np0005539505 dnf[6949]: CentOS Stream 9 - Extras packages                29 kB/s | 8.3 kB     00:00
Nov 29 00:48:01 np0005539505 dnf[6949]: Metadata cache created.
Nov 29 00:48:01 np0005539505 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 00:48:01 np0005539505 systemd[1]: Finished dnf makecache.
Nov 29 00:48:03 np0005539505 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 00:49:00 np0005539505 systemd-logind[794]: Session 1 logged out. Waiting for processes to exit.
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 29 00:49:24 np0005539505 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 29 00:49:24 np0005539505 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0496] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:49:24 np0005539505 systemd-udevd[6962]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0693] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0718] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0721] device (eth1): carrier: link connected
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0723] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0728] policy: auto-activating connection 'Wired connection 1' (b503c079-5c4a-39e2-8fd9-0eb2e190fef7)
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0731] device (eth1): Activation: starting connection 'Wired connection 1' (b503c079-5c4a-39e2-8fd9-0eb2e190fef7)
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0731] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0733] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0738] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:49:24 np0005539505 NetworkManager[858]: <info>  [1764395364.0742] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:24 np0005539505 systemd[4304]: Starting Mark boot as successful...
Nov 29 00:49:24 np0005539505 systemd[4304]: Finished Mark boot as successful.
Nov 29 00:49:25 np0005539505 systemd-logind[794]: New session 3 of user zuul.
Nov 29 00:49:25 np0005539505 systemd[1]: Started Session 3 of User zuul.
Nov 29 00:49:25 np0005539505 python3[6995]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-fab1-3db9-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:49:32 np0005539505 python3[7075]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:49:32 np0005539505 python3[7148]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764395372.1164436-206-6524196787710/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=0c7ddedf7a75ff291520d17d1370517c00e23196 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:49:33 np0005539505 python3[7198]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 00:49:33 np0005539505 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 00:49:33 np0005539505 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 00:49:33 np0005539505 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 00:49:33 np0005539505 systemd[1]: Stopping Network Manager...
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3107] caught SIGTERM, shutting down normally.
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3118] dhcp4 (eth0): canceled DHCP transaction
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3118] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3118] dhcp4 (eth0): state changed no lease
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3121] manager: NetworkManager state is now CONNECTING
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3272] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3272] dhcp4 (eth1): state changed no lease
Nov 29 00:49:33 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:49:33 np0005539505 NetworkManager[858]: <info>  [1764395373.3304] exiting (success)
Nov 29 00:49:33 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:49:33 np0005539505 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 00:49:33 np0005539505 systemd[1]: Stopped Network Manager.
Nov 29 00:49:33 np0005539505 systemd[1]: NetworkManager.service: Consumed 4.503s CPU time, 10.1M memory peak.
Nov 29 00:49:33 np0005539505 systemd[1]: Starting Network Manager...
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.3730] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c017802d-b3d3-4f2d-87a0-b39da9e20414)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.3731] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.3776] manager[0x556ca6868070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 00:49:33 np0005539505 systemd[1]: Starting Hostname Service...
Nov 29 00:49:33 np0005539505 systemd[1]: Started Hostname Service.
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4748] hostname: hostname: using hostnamed
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4750] hostname: static hostname changed from (none) to "np0005539505.novalocal"
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4753] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4757] manager[0x556ca6868070]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4757] manager[0x556ca6868070]: rfkill: WWAN hardware radio set enabled
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4777] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4778] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4778] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4778] manager: Networking is enabled by state file
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4780] settings: Loaded settings plugin: keyfile (internal)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4783] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4804] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4811] dhcp: init: Using DHCP client 'internal'
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4813] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4817] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4821] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4828] device (lo): Activation: starting connection 'lo' (6d2d9121-1479-42a1-9107-1290d8f7122c)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4833] device (eth0): carrier: link connected
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4836] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4840] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4840] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4845] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4851] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4855] device (eth1): carrier: link connected
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4858] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4862] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (b503c079-5c4a-39e2-8fd9-0eb2e190fef7) (indicated)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4862] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4867] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4872] device (eth1): Activation: starting connection 'Wired connection 1' (b503c079-5c4a-39e2-8fd9-0eb2e190fef7)
Nov 29 00:49:33 np0005539505 systemd[1]: Started Network Manager.
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4878] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4881] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4882] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4883] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4885] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4887] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4889] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4891] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4893] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4899] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4901] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4907] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4909] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4924] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4925] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4929] device (lo): Activation: successful, device activated.
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4937] dhcp4 (eth0): state changed new lease, address=38.102.83.200
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4942] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.4994] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.5007] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.5008] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.5010] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.5013] device (eth0): Activation: successful, device activated.
Nov 29 00:49:33 np0005539505 NetworkManager[7211]: <info>  [1764395373.5016] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 00:49:33 np0005539505 systemd[1]: Starting Network Manager Wait Online...
Nov 29 00:49:33 np0005539505 python3[7284]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-fab1-3db9-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:49:43 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:50:03 np0005539505 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3603] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 00:50:18 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:50:18 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3866] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3871] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3886] device (eth1): Activation: successful, device activated.
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3895] manager: startup complete
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3897] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <warn>  [1764395418.3914] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3922] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 systemd[1]: Finished Network Manager Wait Online.
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3992] dhcp4 (eth1): canceled DHCP transaction
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3993] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.3993] dhcp4 (eth1): state changed no lease
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4007] policy: auto-activating connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc)
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4012] device (eth1): Activation: starting connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc)
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4013] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4015] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4023] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4034] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4089] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4091] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 00:50:18 np0005539505 NetworkManager[7211]: <info>  [1764395418.4099] device (eth1): Activation: successful, device activated.
Nov 29 00:50:28 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:50:33 np0005539505 systemd[1]: session-3.scope: Deactivated successfully.
Nov 29 00:50:33 np0005539505 systemd[1]: session-3.scope: Consumed 1.532s CPU time.
Nov 29 00:50:33 np0005539505 systemd-logind[794]: Session 3 logged out. Waiting for processes to exit.
Nov 29 00:50:33 np0005539505 systemd-logind[794]: Removed session 3.
Nov 29 00:50:45 np0005539505 systemd-logind[794]: New session 4 of user zuul.
Nov 29 00:50:45 np0005539505 systemd[1]: Started Session 4 of User zuul.
Nov 29 00:50:45 np0005539505 python3[7395]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:50:46 np0005539505 python3[7468]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395445.612969-365-7855442018785/source _original_basename=tmp80om4m_2 follow=False checksum=202951a95d8ea5ab635db917083142dc6b9b32e4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:50:48 np0005539505 systemd[1]: session-4.scope: Deactivated successfully.
Nov 29 00:50:48 np0005539505 systemd-logind[794]: Session 4 logged out. Waiting for processes to exit.
Nov 29 00:50:48 np0005539505 systemd-logind[794]: Removed session 4.
Nov 29 00:51:32 np0005539505 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 29 00:51:32 np0005539505 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 29 00:51:32 np0005539505 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 29 00:51:32 np0005539505 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 29 00:52:42 np0005539505 systemd[4304]: Created slice User Background Tasks Slice.
Nov 29 00:52:42 np0005539505 systemd[4304]: Starting Cleanup of User's Temporary Files and Directories...
Nov 29 00:52:42 np0005539505 systemd[4304]: Finished Cleanup of User's Temporary Files and Directories.
Nov 29 00:56:27 np0005539505 systemd-logind[794]: New session 5 of user zuul.
Nov 29 00:56:27 np0005539505 systemd[1]: Started Session 5 of User zuul.
Nov 29 00:56:27 np0005539505 python3[7528]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000ca4-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:28 np0005539505 python3[7557]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539505 python3[7583]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539505 python3[7609]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:28 np0005539505 python3[7635]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:29 np0005539505 python3[7661]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:29 np0005539505 python3[7739]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:56:30 np0005539505 python3[7812]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395789.6057515-366-7909243202693/source _original_basename=tmpq4xs0lc8 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:56:31 np0005539505 python3[7862]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 00:56:31 np0005539505 systemd[1]: Reloading.
Nov 29 00:56:31 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:56:32 np0005539505 python3[7918]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 29 00:56:33 np0005539505 python3[7944]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:33 np0005539505 python3[7972]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:33 np0005539505 python3[8000]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:34 np0005539505 python3[8028]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:34 np0005539505 python3[8055]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-2d4e-d9c7-000000000cab-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:56:35 np0005539505 python3[8085]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 29 00:56:38 np0005539505 systemd[1]: session-5.scope: Deactivated successfully.
Nov 29 00:56:38 np0005539505 systemd[1]: session-5.scope: Consumed 4.005s CPU time.
Nov 29 00:56:38 np0005539505 systemd-logind[794]: Session 5 logged out. Waiting for processes to exit.
Nov 29 00:56:38 np0005539505 systemd-logind[794]: Removed session 5.
Nov 29 00:56:40 np0005539505 systemd-logind[794]: New session 6 of user zuul.
Nov 29 00:56:40 np0005539505 systemd[1]: Started Session 6 of User zuul.
Nov 29 00:56:40 np0005539505 python3[8121]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 29 00:56:52 np0005539505 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:56:52 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:01 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  Converting 385 SID table entries...
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:09 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:10 np0005539505 setsebool[8180]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 29 00:57:10 np0005539505 setsebool[8180]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 29 00:57:21 np0005539505 kernel: SELinux:  Converting 388 SID table entries...
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 00:57:21 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 00:57:38 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:57:38 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 00:57:38 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 00:57:38 np0005539505 systemd[1]: Reloading.
Nov 29 00:57:38 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 00:57:38 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 00:58:06 np0005539505 python3[22585]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-965f-b5ec-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 00:58:07 np0005539505 kernel: evm: overlay not supported
Nov 29 00:58:07 np0005539505 systemd[4304]: Starting D-Bus User Message Bus...
Nov 29 00:58:07 np0005539505 dbus-broker-launch[23089]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 29 00:58:07 np0005539505 dbus-broker-launch[23089]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 29 00:58:07 np0005539505 systemd[4304]: Started D-Bus User Message Bus.
Nov 29 00:58:07 np0005539505 dbus-broker-lau[23089]: Ready
Nov 29 00:58:07 np0005539505 systemd[4304]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 29 00:58:07 np0005539505 systemd[4304]: Created slice Slice /user.
Nov 29 00:58:07 np0005539505 systemd[4304]: podman-23013.scope: unit configures an IP firewall, but not running as root.
Nov 29 00:58:07 np0005539505 systemd[4304]: (This warning is only shown for the first unit using IP firewalling.)
Nov 29 00:58:07 np0005539505 systemd[4304]: Started podman-23013.scope.
Nov 29 00:58:08 np0005539505 systemd[4304]: Started podman-pause-b5543c4a.scope.
Nov 29 00:58:08 np0005539505 python3[23469]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.97:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:58:08 np0005539505 python3[23469]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 29 00:58:09 np0005539505 systemd[1]: session-6.scope: Deactivated successfully.
Nov 29 00:58:09 np0005539505 systemd[1]: session-6.scope: Consumed 57.325s CPU time.
Nov 29 00:58:09 np0005539505 systemd-logind[794]: Session 6 logged out. Waiting for processes to exit.
Nov 29 00:58:09 np0005539505 systemd-logind[794]: Removed session 6.
Nov 29 00:58:19 np0005539505 irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 29 00:58:19 np0005539505 irqbalance[784]: IRQ 27 affinity is now unmanaged
Nov 29 00:58:22 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 00:58:22 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 00:58:22 np0005539505 systemd[1]: man-db-cache-update.service: Consumed 54.287s CPU time.
Nov 29 00:58:22 np0005539505 systemd[1]: run-re3d14f02cab74bc9ae7c0605e98a46e7.service: Deactivated successfully.
Nov 29 00:58:34 np0005539505 systemd-logind[794]: New session 7 of user zuul.
Nov 29 00:58:34 np0005539505 systemd[1]: Started Session 7 of User zuul.
Nov 29 00:58:34 np0005539505 python3[29620]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:35 np0005539505 python3[29646]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:36 np0005539505 python3[29672]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539505.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 29 00:58:36 np0005539505 python3[29706]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBL3iUg7wOJDjLm9TipkwWPon/M1FO0neD/5ezFHnUJBmbFpPtrL/PoM+teNA62c4mAkgQYtVxx4T3bRgPp78cTw= zuul@np0005539502.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 29 00:58:37 np0005539505 python3[29784]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 00:58:37 np0005539505 python3[29857]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764395917.0178616-170-210925874106653/source _original_basename=tmpp3nqx1sk follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 00:58:38 np0005539505 python3[29907]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Nov 29 00:58:38 np0005539505 systemd[1]: Starting Hostname Service...
Nov 29 00:58:38 np0005539505 systemd[1]: Started Hostname Service.
Nov 29 00:58:38 np0005539505 systemd-hostnamed[29911]: Changed pretty hostname to 'compute-2'
Nov 29 00:58:38 np0005539505 systemd-hostnamed[29911]: Hostname set to <compute-2> (static)
Nov 29 00:58:38 np0005539505 NetworkManager[7211]: <info>  [1764395918.6724] hostname: static hostname changed from "np0005539505.novalocal" to "compute-2"
Nov 29 00:58:38 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 00:58:38 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 00:58:39 np0005539505 systemd[1]: session-7.scope: Deactivated successfully.
Nov 29 00:58:39 np0005539505 systemd[1]: session-7.scope: Consumed 2.187s CPU time.
Nov 29 00:58:39 np0005539505 systemd-logind[794]: Session 7 logged out. Waiting for processes to exit.
Nov 29 00:58:39 np0005539505 systemd-logind[794]: Removed session 7.
Nov 29 00:58:48 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 00:59:08 np0005539505 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:02:57 np0005539505 systemd-logind[794]: New session 8 of user zuul.
Nov 29 01:02:57 np0005539505 systemd[1]: Started Session 8 of User zuul.
Nov 29 01:02:58 np0005539505 python3[30023]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:03:00 np0005539505 python3[30139]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:01 np0005539505 python3[30212]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:01 np0005539505 python3[30238]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:02 np0005539505 python3[30311]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:02 np0005539505 python3[30337]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:02 np0005539505 python3[30410]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:03 np0005539505 python3[30436]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:03 np0005539505 python3[30509]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:03 np0005539505 python3[30535]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:04 np0005539505 python3[30608]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:04 np0005539505 python3[30634]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:04 np0005539505 python3[30707]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:04 np0005539505 python3[30733]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 29 01:03:05 np0005539505 python3[30806]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764396180.674899-34086-220758865657311/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:03:14 np0005539505 python3[30854]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:08:14 np0005539505 systemd-logind[794]: Session 8 logged out. Waiting for processes to exit.
Nov 29 01:08:14 np0005539505 systemd[1]: session-8.scope: Deactivated successfully.
Nov 29 01:08:14 np0005539505 systemd[1]: session-8.scope: Consumed 5.219s CPU time.
Nov 29 01:08:14 np0005539505 systemd-logind[794]: Removed session 8.
Nov 29 01:17:06 np0005539505 systemd-logind[794]: New session 9 of user zuul.
Nov 29 01:17:06 np0005539505 systemd[1]: Started Session 9 of User zuul.
Nov 29 01:17:07 np0005539505 python3.9[31017]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:09 np0005539505 python3.9[31198]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:17:17 np0005539505 systemd[1]: session-9.scope: Deactivated successfully.
Nov 29 01:17:17 np0005539505 systemd[1]: session-9.scope: Consumed 7.742s CPU time.
Nov 29 01:17:17 np0005539505 systemd-logind[794]: Session 9 logged out. Waiting for processes to exit.
Nov 29 01:17:17 np0005539505 systemd-logind[794]: Removed session 9.
Nov 29 01:17:33 np0005539505 systemd-logind[794]: New session 10 of user zuul.
Nov 29 01:17:33 np0005539505 systemd[1]: Started Session 10 of User zuul.
Nov 29 01:17:33 np0005539505 python3.9[31409]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 29 01:17:35 np0005539505 python3.9[31583]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:36 np0005539505 python3.9[31735]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:17:37 np0005539505 python3.9[31888]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:17:38 np0005539505 python3.9[32040]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:39 np0005539505 python3.9[32192]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:17:39 np0005539505 python3.9[32315]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397058.6263611-185-281214055912679/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:40 np0005539505 python3.9[32467]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:41 np0005539505 python3.9[32623]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:17:42 np0005539505 python3.9[32775]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:17:43 np0005539505 python3.9[32925]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:17:47 np0005539505 python3.9[33178]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:17:48 np0005539505 python3.9[33328]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:49 np0005539505 python3.9[33482]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:17:50 np0005539505 python3.9[33640]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:17:51 np0005539505 python3.9[33724]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:18:42 np0005539505 systemd[1]: Reloading.
Nov 29 01:18:42 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:42 np0005539505 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 29 01:18:43 np0005539505 systemd[1]: Reloading.
Nov 29 01:18:43 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:43 np0005539505 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 29 01:18:43 np0005539505 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 29 01:18:43 np0005539505 systemd[1]: Reloading.
Nov 29 01:18:43 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:18:43 np0005539505 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 29 01:18:43 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:18:43 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:18:43 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:19:59 np0005539505 kernel: SELinux:  Converting 2718 SID table entries...
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:19:59 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:19:59 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 29 01:19:59 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:19:59 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:19:59 np0005539505 systemd[1]: Reloading.
Nov 29 01:19:59 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:19:59 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:20:01 np0005539505 python3.9[35246]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:03 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:20:03 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:20:03 np0005539505 systemd[1]: man-db-cache-update.service: Consumed 1.166s CPU time.
Nov 29 01:20:03 np0005539505 systemd[1]: run-rfe96edf2f5d1408b8922b045623eaa19.service: Deactivated successfully.
Nov 29 01:20:04 np0005539505 python3.9[35528]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 29 01:20:05 np0005539505 python3.9[35680]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 29 01:20:09 np0005539505 python3.9[35833]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:15 np0005539505 python3.9[35985]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 29 01:20:19 np0005539505 python3.9[36137]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:20 np0005539505 python3.9[36289]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:21 np0005539505 python3.9[36412]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397219.8640738-674-207419500177658/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:29 np0005539505 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 29 01:20:29 np0005539505 irqbalance[784]: IRQ 26 affinity is now unmanaged
Nov 29 01:20:29 np0005539505 python3.9[36564]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:30 np0005539505 python3.9[36716]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:31 np0005539505 python3.9[36869]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:20:32 np0005539505 python3.9[37021]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 29 01:20:32 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:20:32 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:20:33 np0005539505 python3.9[37175]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:20:34 np0005539505 python3.9[37333]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:20:35 np0005539505 python3.9[37493]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 29 01:20:36 np0005539505 python3.9[37646]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:20:37 np0005539505 python3.9[37804]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 29 01:20:38 np0005539505 python3.9[37956]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:20:42 np0005539505 python3.9[38109]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:43 np0005539505 python3.9[38261]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:44 np0005539505 python3.9[38384]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397242.9880958-1032-1273475942247/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:45 np0005539505 python3.9[38536]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:20:45 np0005539505 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:20:45 np0005539505 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 29 01:20:45 np0005539505 kernel: Bridge firewalling registered
Nov 29 01:20:45 np0005539505 systemd-modules-load[38540]: Inserted module 'br_netfilter'
Nov 29 01:20:45 np0005539505 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:20:46 np0005539505 python3.9[38696]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:20:46 np0005539505 python3.9[38819]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397245.7442102-1100-198075892460079/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:20:47 np0005539505 python3.9[38971]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:20:52 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:20:52 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:20:53 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:20:53 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:20:53 np0005539505 systemd[1]: Reloading.
Nov 29 01:20:53 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:20:53 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:20:55 np0005539505 python3.9[40936]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:56 np0005539505 python3.9[41985]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 29 01:20:57 np0005539505 python3.9[42837]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:20:57 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:20:57 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:20:57 np0005539505 systemd[1]: man-db-cache-update.service: Consumed 4.693s CPU time.
Nov 29 01:20:57 np0005539505 systemd[1]: run-rb49782174a07475fa2cf8c2c0a8edbb6.service: Deactivated successfully.
Nov 29 01:20:58 np0005539505 python3.9[43194]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:20:58 np0005539505 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:20:58 np0005539505 systemd[1]: Starting Authorization Manager...
Nov 29 01:20:58 np0005539505 polkitd[43411]: Started polkitd version 0.117
Nov 29 01:20:58 np0005539505 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:20:58 np0005539505 systemd[1]: Started Authorization Manager.
Nov 29 01:20:59 np0005539505 python3.9[43581]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:20:59 np0005539505 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 29 01:20:59 np0005539505 systemd[1]: tuned.service: Deactivated successfully.
Nov 29 01:20:59 np0005539505 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 29 01:20:59 np0005539505 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 29 01:21:00 np0005539505 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 29 01:21:00 np0005539505 python3.9[43743]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 29 01:21:04 np0005539505 python3.9[43895]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:21:04 np0005539505 systemd[1]: Reloading.
Nov 29 01:21:04 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:05 np0005539505 python3.9[44084]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:21:05 np0005539505 systemd[1]: Reloading.
Nov 29 01:21:05 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:21:06 np0005539505 python3.9[44273]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:07 np0005539505 python3.9[44426]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:07 np0005539505 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 29 01:21:08 np0005539505 python3.9[44579]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:10 np0005539505 python3.9[44741]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:11 np0005539505 python3.9[44894]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:21:11 np0005539505 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 29 01:21:11 np0005539505 systemd[1]: Stopped Apply Kernel Variables.
Nov 29 01:21:11 np0005539505 systemd[1]: Stopping Apply Kernel Variables...
Nov 29 01:21:11 np0005539505 systemd[1]: Starting Apply Kernel Variables...
Nov 29 01:21:11 np0005539505 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 29 01:21:11 np0005539505 systemd[1]: Finished Apply Kernel Variables.
Nov 29 01:21:11 np0005539505 systemd[1]: session-10.scope: Deactivated successfully.
Nov 29 01:21:11 np0005539505 systemd-logind[794]: Session 10 logged out. Waiting for processes to exit.
Nov 29 01:21:11 np0005539505 systemd[1]: session-10.scope: Consumed 2min 20.451s CPU time.
Nov 29 01:21:11 np0005539505 systemd-logind[794]: Removed session 10.
Nov 29 01:21:16 np0005539505 systemd-logind[794]: New session 11 of user zuul.
Nov 29 01:21:16 np0005539505 systemd[1]: Started Session 11 of User zuul.
Nov 29 01:21:17 np0005539505 python3.9[45077]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:18 np0005539505 python3.9[45231]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:20 np0005539505 python3.9[45387]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:22 np0005539505 python3.9[45539]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:23 np0005539505 python3.9[45695]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:21:23 np0005539505 python3.9[45779]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:21:26 np0005539505 python3.9[45932]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:21:27 np0005539505 python3.9[46103]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:21:28 np0005539505 python3.9[46255]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:21:28 np0005539505 systemd[1]: var-lib-containers-storage-overlay-compat2395578833-merged.mount: Deactivated successfully.
Nov 29 01:21:28 np0005539505 podman[46256]: 2025-11-29 06:21:28.402157503 +0000 UTC m=+0.083229352 system refresh
Nov 29 01:21:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:21:29 np0005539505 python3.9[46418]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:21:30 np0005539505 python3.9[46541]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397289.414001-294-281104616058019/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9cbf0fba90be20f65f613c0c53e2f3fe7b3f49d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:21:31 np0005539505 python3.9[46693]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:21:32 np0005539505 python3.9[46816]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397290.9236112-339-207806431411378/.source.conf follow=False _original_basename=registries.conf.j2 checksum=25aa6c560e50dcbd81b989ea46a7865cb55b8998 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:32 np0005539505 python3.9[46968]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:33 np0005539505 python3.9[47120]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:34 np0005539505 python3.9[47272]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:34 np0005539505 python3.9[47424]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:21:35 np0005539505 python3.9[47574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:21:36 np0005539505 python3.9[47728]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:39 np0005539505 python3.9[47881]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:42 np0005539505 python3.9[48041]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:45 np0005539505 python3.9[48194]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:49 np0005539505 python3.9[48347]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:52 np0005539505 python3.9[48503]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:57 np0005539505 python3.9[48673]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:21:59 np0005539505 python3.9[48826]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:22:23 np0005539505 python3.9[49163]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:22:25 np0005539505 python3.9[49319]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:26 np0005539505 python3.9[49494]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:22:27 np0005539505 python3.9[49617]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764397346.1422381-784-249094901272657/.source.json _original_basename=.6oq71m10 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:22:28 np0005539505 python3.9[49769]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:28 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:34 np0005539505 systemd[1]: var-lib-containers-storage-overlay-compat3103008093-lower\x2dmapped.mount: Deactivated successfully.
Nov 29 01:22:43 np0005539505 podman[49781]: 2025-11-29 06:22:43.803126437 +0000 UTC m=+15.217242341 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:22:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:45 np0005539505 python3.9[50081]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:45 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539505 podman[50093]: 2025-11-29 06:22:47.629363864 +0000 UTC m=+2.421958254 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:22:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:22:49 np0005539505 python3.9[50329]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:22:49 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:29 np0005539505 podman[50341]: 2025-11-29 06:23:29.099650144 +0000 UTC m=+39.721085663 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:23:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:30 np0005539505 python3.9[50703]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:23:30 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:39 np0005539505 podman[50716]: 2025-11-29 06:23:39.578746235 +0000 UTC m=+9.291981564 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 01:23:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:40 np0005539505 python3.9[50975]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 29 01:23:43 np0005539505 podman[50989]: 2025-11-29 06:23:43.089242752 +0000 UTC m=+2.671710761 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 01:23:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:23:43 np0005539505 systemd[1]: session-11.scope: Deactivated successfully.
Nov 29 01:23:43 np0005539505 systemd[1]: session-11.scope: Consumed 1min 48.769s CPU time.
Nov 29 01:23:43 np0005539505 systemd-logind[794]: Session 11 logged out. Waiting for processes to exit.
Nov 29 01:23:43 np0005539505 systemd-logind[794]: Removed session 11.
Nov 29 01:23:49 np0005539505 systemd-logind[794]: New session 12 of user zuul.
Nov 29 01:23:49 np0005539505 systemd[1]: Started Session 12 of User zuul.
Nov 29 01:23:51 np0005539505 python3.9[51284]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:23:52 np0005539505 python3.9[51440]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 29 01:23:54 np0005539505 python3.9[51593]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:24:03 np0005539505 python3.9[51751]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:24:06 np0005539505 python3.9[51911]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:24:06 np0005539505 python3.9[51995]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:24:18 np0005539505 python3.9[52157]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:34 np0005539505 kernel: SELinux:  Converting 2731 SID table entries...
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:24:34 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:24:34 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 29 01:24:34 np0005539505 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 29 01:24:36 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:24:36 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:24:36 np0005539505 systemd[1]: Reloading.
Nov 29 01:24:36 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:36 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:36 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:24:38 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:24:38 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:24:38 np0005539505 systemd[1]: run-r3e307cae6c1e46a6a6ffb6d9bfca37ef.service: Deactivated successfully.
Nov 29 01:24:41 np0005539505 python3.9[53256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:24:41 np0005539505 systemd[1]: Reloading.
Nov 29 01:24:41 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:41 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:41 np0005539505 systemd[1]: Starting Open vSwitch Database Unit...
Nov 29 01:24:41 np0005539505 chown[53297]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 29 01:24:42 np0005539505 ovs-ctl[53303]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 29 01:24:42 np0005539505 ovs-ctl[53303]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 29 01:24:42 np0005539505 ovs-ctl[53303]: Starting ovsdb-server [  OK  ]
Nov 29 01:24:42 np0005539505 ovs-vsctl[53352]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 29 01:24:42 np0005539505 ovs-vsctl[53368]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 29 01:24:42 np0005539505 ovs-ctl[53303]: Configuring Open vSwitch system IDs [  OK  ]
Nov 29 01:24:42 np0005539505 ovs-vsctl[53378]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 01:24:42 np0005539505 ovs-ctl[53303]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:24:42 np0005539505 systemd[1]: Started Open vSwitch Database Unit.
Nov 29 01:24:42 np0005539505 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 29 01:24:42 np0005539505 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 29 01:24:42 np0005539505 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 29 01:24:42 np0005539505 kernel: openvswitch: Open vSwitch switching datapath
Nov 29 01:24:42 np0005539505 ovs-ctl[53422]: Inserting openvswitch module [  OK  ]
Nov 29 01:24:42 np0005539505 ovs-ctl[53391]: Starting ovs-vswitchd [  OK  ]
Nov 29 01:24:42 np0005539505 ovs-vsctl[53440]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Nov 29 01:24:42 np0005539505 ovs-ctl[53391]: Enabling remote OVSDB managers [  OK  ]
Nov 29 01:24:42 np0005539505 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 29 01:24:42 np0005539505 systemd[1]: Starting Open vSwitch...
Nov 29 01:24:42 np0005539505 systemd[1]: Finished Open vSwitch.
Nov 29 01:24:43 np0005539505 python3.9[53591]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:44 np0005539505 python3.9[53743]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 29 01:24:46 np0005539505 kernel: SELinux:  Converting 2745 SID table entries...
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:24:46 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:24:47 np0005539505 python3.9[53898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:24:48 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 29 01:24:48 np0005539505 python3.9[54056]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:52 np0005539505 python3.9[54209]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:24:54 np0005539505 python3.9[54496]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:24:55 np0005539505 python3.9[54646]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:24:56 np0005539505 python3.9[54800]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:24:59 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:24:59 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:24:59 np0005539505 systemd[1]: Reloading.
Nov 29 01:24:59 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:24:59 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:24:59 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:25:03 np0005539505 python3.9[55116]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:25:03 np0005539505 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 29 01:25:03 np0005539505 systemd[1]: Stopped Network Manager Wait Online.
Nov 29 01:25:03 np0005539505 systemd[1]: Stopping Network Manager Wait Online...
Nov 29 01:25:03 np0005539505 systemd[1]: Stopping Network Manager...
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.1384] caught SIGTERM, shutting down normally.
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.1404] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.1404] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.1404] dhcp4 (eth0): state changed no lease
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.1407] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:25:03 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:25:03 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:25:03 np0005539505 NetworkManager[7211]: <info>  [1764397503.9767] exiting (success)
Nov 29 01:25:04 np0005539505 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 29 01:25:04 np0005539505 systemd[1]: Stopped Network Manager.
Nov 29 01:25:04 np0005539505 systemd[1]: NetworkManager.service: Consumed 13.375s CPU time, 4.1M memory peak, read 0B from disk, written 29.5K to disk.
Nov 29 01:25:04 np0005539505 systemd[1]: Starting Network Manager...
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.0718] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:c017802d-b3d3-4f2d-87a0-b39da9e20414)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.0719] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.0791] manager[0x55d2a9472090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 29 01:25:04 np0005539505 systemd[1]: Starting Hostname Service...
Nov 29 01:25:04 np0005539505 systemd[1]: Started Hostname Service.
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1821] hostname: hostname: using hostnamed
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1821] hostname: static hostname changed from (none) to "compute-2"
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1826] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1831] manager[0x55d2a9472090]: rfkill: Wi-Fi hardware radio set enabled
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1831] manager[0x55d2a9472090]: rfkill: WWAN hardware radio set enabled
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1851] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1858] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1859] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1859] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1860] manager: Networking is enabled by state file
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1861] settings: Loaded settings plugin: keyfile (internal)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1864] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1886] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1895] dhcp: init: Using DHCP client 'internal'
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1898] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1902] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1908] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1915] device (lo): Activation: starting connection 'lo' (6d2d9121-1479-42a1-9107-1290d8f7122c)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1920] device (eth0): carrier: link connected
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1924] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1928] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1929] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1934] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1939] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1945] device (eth1): carrier: link connected
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1949] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1956] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc) (indicated)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1956] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1962] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1970] device (eth1): Activation: starting connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc)
Nov 29 01:25:04 np0005539505 systemd[1]: Started Network Manager.
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1976] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.1984] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2005] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2010] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2015] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2020] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2025] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2029] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2036] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2050] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2056] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2071] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2098] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2113] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2117] dhcp4 (eth0): state changed new lease, address=38.102.83.200
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2121] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2130] device (lo): Activation: successful, device activated.
Nov 29 01:25:04 np0005539505 NetworkManager[55134]: <info>  [1764397504.2149] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 29 01:25:04 np0005539505 systemd[1]: Starting Network Manager Wait Online...
Nov 29 01:25:05 np0005539505 python3.9[55311]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1713] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1733] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1739] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1743] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1747] device (eth1): Activation: successful, device activated.
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1757] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1759] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1763] manager: NetworkManager state is now CONNECTED_SITE
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1766] device (eth0): Activation: successful, device activated.
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.1770] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 29 01:25:05 np0005539505 NetworkManager[55134]: <info>  [1764397505.5590] manager: startup complete
Nov 29 01:25:05 np0005539505 systemd[1]: Finished Network Manager Wait Online.
Nov 29 01:25:09 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:25:09 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:25:09 np0005539505 systemd[1]: run-r92910ec1cc2043de98261aa4cd044d0a.service: Deactivated successfully.
Nov 29 01:25:15 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:25:19 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:25:19 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:25:19 np0005539505 systemd[1]: Reloading.
Nov 29 01:25:19 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:25:19 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:25:19 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:25:21 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:25:21 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:25:21 np0005539505 systemd[1]: run-ra95432b21d564f0180dae5a5b2f013c8.service: Deactivated successfully.
Nov 29 01:25:24 np0005539505 python3.9[55803]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:25:25 np0005539505 python3.9[55955]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539505 python3.9[56109]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:26 np0005539505 python3.9[56261]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:27 np0005539505 python3.9[56413]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:28 np0005539505 python3.9[56565]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:28 np0005539505 python3.9[56717]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:31 np0005539505 python3.9[56840]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397528.4989164-656-174518202717748/.source _original_basename=.htcxy77g follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:32 np0005539505 python3.9[56992]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:34 np0005539505 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 29 01:25:34 np0005539505 python3.9[57144]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 29 01:25:35 np0005539505 python3.9[57298]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:38 np0005539505 python3.9[57725]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 29 01:25:39 np0005539505 ansible-async_wrapper.py[57900]: Invoked with j363583086079 300 /home/zuul/.ansible/tmp/ansible-tmp-1764397538.5012605-853-10653097604876/AnsiballZ_edpm_os_net_config.py _
Nov 29 01:25:39 np0005539505 ansible-async_wrapper.py[57903]: Starting module and watcher
Nov 29 01:25:39 np0005539505 ansible-async_wrapper.py[57903]: Start watching 57904 (300)
Nov 29 01:25:39 np0005539505 ansible-async_wrapper.py[57904]: Start module (57904)
Nov 29 01:25:39 np0005539505 ansible-async_wrapper.py[57900]: Return async_wrapper task started.
Nov 29 01:25:39 np0005539505 python3.9[57905]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 29 01:25:40 np0005539505 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 29 01:25:40 np0005539505 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 29 01:25:40 np0005539505 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 29 01:25:40 np0005539505 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 29 01:25:40 np0005539505 kernel: cfg80211: failed to load regulatory.db
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3148] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3161] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3599] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3600] audit: op="connection-add" uuid="339bbae2-ce01-49d1-9634-86b85845c8dc" name="br-ex-br" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3615] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3617] audit: op="connection-add" uuid="e7e3d710-89af-46fe-8833-7abb1de9b6fd" name="br-ex-port" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3629] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3630] audit: op="connection-add" uuid="2e125c0d-15cf-4c4a-9609-dac1e37f97c8" name="eth1-port" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3639] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3640] audit: op="connection-add" uuid="e223ecdd-ec43-4d68-865c-e7b2cb3bd5c6" name="vlan20-port" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3650] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3651] audit: op="connection-add" uuid="6cc0977d-dea8-4196-8400-abcf6dcf079f" name="vlan21-port" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3661] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3662] audit: op="connection-add" uuid="0877b5bd-1eb8-4779-a865-2b82dad0871b" name="vlan22-port" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3680] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=57906 uid=0 result="success"
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3695] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 29 01:25:41 np0005539505 NetworkManager[55134]: <info>  [1764397541.3696] audit: op="connection-add" uuid="4164fe9f-0229-4284-850f-1a1802e15c76" name="br-ex-if" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0126] audit: op="connection-update" uuid="16f47f05-9a4b-50ba-bb1a-0eecad4f0adc" name="ci-private-network" args="connection.master,connection.port-type,connection.timestamp,connection.controller,connection.slave-type,ipv4.routing-rules,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.method,ipv4.routes,ipv6.routes,ipv6.dns,ipv6.routing-rules,ipv6.addresses,ipv6.addr-gen-mode,ipv6.method,ovs-interface.type,ovs-external-ids.data" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0163] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0166] audit: op="connection-add" uuid="4baa50c7-6d37-465e-b363-0ce040365902" name="vlan20-if" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0197] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0200] audit: op="connection-add" uuid="fb666be3-3fe6-4514-80e5-ba66e04b0a44" name="vlan21-if" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0245] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0249] audit: op="connection-add" uuid="51cd6f56-f377-44f7-984d-c375604b3603" name="vlan22-if" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0276] audit: op="connection-delete" uuid="b503c079-5c4a-39e2-8fd9-0eb2e190fef7" name="Wired connection 1" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0299] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0317] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0326] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (339bbae2-ce01-49d1-9634-86b85845c8dc)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0327] audit: op="connection-activate" uuid="339bbae2-ce01-49d1-9634-86b85845c8dc" name="br-ex-br" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0331] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0344] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0352] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (e7e3d710-89af-46fe-8833-7abb1de9b6fd)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0355] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0366] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0375] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (2e125c0d-15cf-4c4a-9609-dac1e37f97c8)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0379] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0393] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0402] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (e223ecdd-ec43-4d68-865c-e7b2cb3bd5c6)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0406] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0420] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0429] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (6cc0977d-dea8-4196-8400-abcf6dcf079f)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0433] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0444] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0454] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (0877b5bd-1eb8-4779-a865-2b82dad0871b)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0455] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0460] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0463] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0474] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0481] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0487] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (4164fe9f-0229-4284-850f-1a1802e15c76)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0488] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0494] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0497] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0499] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0502] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0523] device (eth1): disconnecting for new activation request.
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0524] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0529] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0534] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0536] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0541] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0549] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0555] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (4baa50c7-6d37-465e-b363-0ce040365902)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0556] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0560] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0563] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0565] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0568] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0576] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0587] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (fb666be3-3fe6-4514-80e5-ba66e04b0a44)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0589] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0593] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0594] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0596] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0599] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0604] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0608] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (51cd6f56-f377-44f7-984d-c375604b3603)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0609] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0612] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0614] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0615] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0616] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0627] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=57906 uid=0 result="success"
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0628] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0630] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0631] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0636] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0639] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0641] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0644] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0646] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0650] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0654] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0657] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0658] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 kernel: ovs-system: entered promiscuous mode
Nov 29 01:25:42 np0005539505 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0672] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0675] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0678] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0679] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 systemd-udevd[57909]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:25:42 np0005539505 kernel: Timeout policy base is empty
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0692] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0696] dhcp4 (eth0): canceled DHCP transaction
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0696] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0696] dhcp4 (eth0): state changed no lease
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0699] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0707] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.0710] audit: op="device-reapply" interface="eth1" ifindex=3 pid=57906 uid=0 result="fail" reason="Device is not activated"
Nov 29 01:25:42 np0005539505 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 29 01:25:42 np0005539505 kernel: br-ex: entered promiscuous mode
Nov 29 01:25:42 np0005539505 kernel: vlan20: entered promiscuous mode
Nov 29 01:25:42 np0005539505 systemd-udevd[57911]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.3540] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.3558] dhcp4 (eth0): state changed new lease, address=38.102.83.200
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.3581] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.3595] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:42 np0005539505 NetworkManager[55134]: <info>  [1764397542.3602] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:42 np0005539505 kernel: vlan21: entered promiscuous mode
Nov 29 01:25:42 np0005539505 kernel: vlan22: entered promiscuous mode
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0105] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0263] device (eth1): Activation: starting connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc)
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0270] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0272] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0274] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0276] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0277] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0283] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0286] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0296] device (eth1): disconnecting for new activation request.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0297] audit: op="connection-activate" uuid="16f47f05-9a4b-50ba-bb1a-0eecad4f0adc" name="ci-private-network" pid=57906 uid=0 result="success"
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0300] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0327] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0336] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0339] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0344] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0357] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0362] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0366] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0372] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0376] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0384] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0387] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0390] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0395] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0400] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0439] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0447] device (eth1): Activation: starting connection 'ci-private-network' (16f47f05-9a4b-50ba-bb1a-0eecad4f0adc)
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0455] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57906 uid=0 result="success"
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0458] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0491] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0496] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0505] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0514] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0536] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0546] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0550] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0571] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0579] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0586] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0589] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0591] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0598] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0605] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0608] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0612] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0617] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0623] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0625] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0631] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0638] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0639] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 29 01:25:43 np0005539505 NetworkManager[55134]: <info>  [1764397543.0644] device (eth1): Activation: successful, device activated.
Nov 29 01:25:43 np0005539505 python3.9[58149]: ansible-ansible.legacy.async_status Invoked with jid=j363583086079.57900 mode=status _async_dir=/root/.ansible_async
Nov 29 01:25:44 np0005539505 ansible-async_wrapper.py[57903]: 57904 still running (300)
Nov 29 01:25:44 np0005539505 NetworkManager[55134]: <info>  [1764397544.5885] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57906 uid=0 result="success"
Nov 29 01:25:44 np0005539505 NetworkManager[55134]: <info>  [1764397544.7243] checkpoint[0x55d2a9448950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 29 01:25:44 np0005539505 NetworkManager[55134]: <info>  [1764397544.7245] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=57906 uid=0 result="success"
Nov 29 01:25:44 np0005539505 NetworkManager[55134]: <info>  [1764397544.9941] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57906 uid=0 result="success"
Nov 29 01:25:44 np0005539505 NetworkManager[55134]: <info>  [1764397544.9952] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57906 uid=0 result="success"
Nov 29 01:25:45 np0005539505 NetworkManager[55134]: <info>  [1764397545.6886] audit: op="networking-control" arg="global-dns-configuration" pid=57906 uid=0 result="success"
Nov 29 01:25:46 np0005539505 python3.9[58345]: ansible-ansible.legacy.async_status Invoked with jid=j363583086079.57900 mode=status _async_dir=/root/.ansible_async
Nov 29 01:25:46 np0005539505 NetworkManager[55134]: <info>  [1764397546.9025] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 29 01:25:47 np0005539505 NetworkManager[55134]: <info>  [1764397547.4590] audit: op="networking-control" arg="global-dns-configuration" pid=57906 uid=0 result="success"
Nov 29 01:25:47 np0005539505 NetworkManager[55134]: <info>  [1764397547.4618] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57906 uid=0 result="success"
Nov 29 01:25:47 np0005539505 NetworkManager[55134]: <info>  [1764397547.5950] checkpoint[0x55d2a9448a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 29 01:25:47 np0005539505 NetworkManager[55134]: <info>  [1764397547.5953] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=57906 uid=0 result="success"
Nov 29 01:25:47 np0005539505 ansible-async_wrapper.py[57904]: Module complete (57904)
Nov 29 01:25:49 np0005539505 ansible-async_wrapper.py[57903]: Done in kid B.
Nov 29 01:25:50 np0005539505 python3.9[58450]: ansible-ansible.legacy.async_status Invoked with jid=j363583086079.57900 mode=status _async_dir=/root/.ansible_async
Nov 29 01:25:51 np0005539505 python3.9[58549]: ansible-ansible.legacy.async_status Invoked with jid=j363583086079.57900 mode=cleanup _async_dir=/root/.ansible_async
Nov 29 01:25:51 np0005539505 python3.9[58701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:52 np0005539505 python3.9[58824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397551.3025708-944-159123684671490/.source.returncode _original_basename=.hzfsw397 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:53 np0005539505 python3.9[58977]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:25:54 np0005539505 python3.9[59100]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397553.3811822-992-263068573701596/.source.cfg _original_basename=.goc0b877 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:25:55 np0005539505 python3.9[59252]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:25:55 np0005539505 systemd[1]: Reloading Network Manager...
Nov 29 01:25:55 np0005539505 NetworkManager[55134]: <info>  [1764397555.4004] audit: op="reload" arg="0" pid=59256 uid=0 result="success"
Nov 29 01:25:55 np0005539505 NetworkManager[55134]: <info>  [1764397555.4014] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 29 01:25:55 np0005539505 systemd[1]: Reloaded Network Manager.
Nov 29 01:25:56 np0005539505 systemd[1]: session-12.scope: Deactivated successfully.
Nov 29 01:25:56 np0005539505 systemd[1]: session-12.scope: Consumed 51.397s CPU time.
Nov 29 01:25:56 np0005539505 systemd-logind[794]: Session 12 logged out. Waiting for processes to exit.
Nov 29 01:25:56 np0005539505 systemd-logind[794]: Removed session 12.
Nov 29 01:26:01 np0005539505 systemd-logind[794]: New session 13 of user zuul.
Nov 29 01:26:01 np0005539505 systemd[1]: Started Session 13 of User zuul.
Nov 29 01:26:02 np0005539505 python3.9[59440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:03 np0005539505 python3.9[59595]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:05 np0005539505 python3.9[59784]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:05 np0005539505 systemd[1]: session-13.scope: Deactivated successfully.
Nov 29 01:26:05 np0005539505 systemd[1]: session-13.scope: Consumed 2.567s CPU time.
Nov 29 01:26:05 np0005539505 systemd-logind[794]: Session 13 logged out. Waiting for processes to exit.
Nov 29 01:26:05 np0005539505 systemd-logind[794]: Removed session 13.
Nov 29 01:26:05 np0005539505 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 29 01:26:14 np0005539505 systemd-logind[794]: New session 14 of user zuul.
Nov 29 01:26:14 np0005539505 systemd[1]: Started Session 14 of User zuul.
Nov 29 01:26:15 np0005539505 python3.9[59968]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:16 np0005539505 python3.9[60122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:18 np0005539505 python3.9[60278]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:19 np0005539505 python3.9[60362]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:22 np0005539505 python3.9[60515]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:26:24 np0005539505 python3.9[60706]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:25 np0005539505 python3.9[60858]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:26:26 np0005539505 python3.9[61021]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:26 np0005539505 python3.9[61099]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:27 np0005539505 python3.9[61251]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:28 np0005539505 python3.9[61329]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:29 np0005539505 python3.9[61481]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:29 np0005539505 python3.9[61633]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:30 np0005539505 python3.9[61785]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:31 np0005539505 python3.9[61937]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:26:32 np0005539505 python3.9[62089]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:37 np0005539505 python3.9[62242]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:26:38 np0005539505 python3.9[62396]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:26:39 np0005539505 python3.9[62548]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:26:40 np0005539505 python3.9[62700]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:26:41 np0005539505 python3.9[62853]: ansible-service_facts Invoked
Nov 29 01:26:41 np0005539505 network[62870]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:26:41 np0005539505 network[62871]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:26:41 np0005539505 network[62872]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:26:47 np0005539505 python3.9[63324]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:26:53 np0005539505 python3.9[63477]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 29 01:26:55 np0005539505 python3.9[63629]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:56 np0005539505 python3.9[63754]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397615.063234-664-211316847503696/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:57 np0005539505 python3.9[63908]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:26:57 np0005539505 python3.9[64033]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397616.670968-709-3928977137896/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:26:59 np0005539505 python3.9[64187]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:01 np0005539505 python3.9[64341]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:02 np0005539505 python3.9[64425]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:04 np0005539505 python3.9[64579]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:27:04 np0005539505 python3.9[64663]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:04 np0005539505 systemd[1]: Stopping NTP client/server...
Nov 29 01:27:04 np0005539505 chronyd[788]: chronyd exiting
Nov 29 01:27:04 np0005539505 systemd[1]: chronyd.service: Deactivated successfully.
Nov 29 01:27:04 np0005539505 systemd[1]: Stopped NTP client/server.
Nov 29 01:27:04 np0005539505 systemd[1]: Starting NTP client/server...
Nov 29 01:27:05 np0005539505 chronyd[64671]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 29 01:27:05 np0005539505 chronyd[64671]: Frequency -28.327 +/- 0.200 ppm read from /var/lib/chrony/drift
Nov 29 01:27:05 np0005539505 chronyd[64671]: Loaded seccomp filter (level 2)
Nov 29 01:27:05 np0005539505 systemd[1]: Started NTP client/server.
Nov 29 01:27:06 np0005539505 systemd[1]: session-14.scope: Deactivated successfully.
Nov 29 01:27:06 np0005539505 systemd[1]: session-14.scope: Consumed 25.306s CPU time.
Nov 29 01:27:06 np0005539505 systemd-logind[794]: Session 14 logged out. Waiting for processes to exit.
Nov 29 01:27:06 np0005539505 systemd-logind[794]: Removed session 14.
Nov 29 01:27:12 np0005539505 systemd-logind[794]: New session 15 of user zuul.
Nov 29 01:27:12 np0005539505 systemd[1]: Started Session 15 of User zuul.
Nov 29 01:27:13 np0005539505 python3.9[64850]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:27:14 np0005539505 python3.9[65006]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:15 np0005539505 python3.9[65181]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:15 np0005539505 python3.9[65259]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.a8ssad45 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:16 np0005539505 python3.9[65411]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:17 np0005539505 python3.9[65534]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397636.3892918-151-231116320433429/.source _original_basename=.8wlr9d2r follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:18 np0005539505 python3.9[65686]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:19 np0005539505 python3.9[65838]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:20 np0005539505 python3.9[65961]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397638.9395726-222-246243585447335/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:21 np0005539505 python3.9[66113]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:21 np0005539505 python3.9[66236]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397641.0460627-222-192216235317772/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:27:22 np0005539505 python3.9[66388]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:23 np0005539505 python3.9[66540]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:23 np0005539505 python3.9[66663]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397643.0008335-334-195059198509465/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:24 np0005539505 python3.9[66815]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:25 np0005539505 python3.9[66938]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397644.5117512-379-12289026527640/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:26 np0005539505 python3.9[67090]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:26 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:26 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:26 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:27 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:27 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:27 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:27 np0005539505 systemd[1]: Starting EDPM Container Shutdown...
Nov 29 01:27:27 np0005539505 systemd[1]: Finished EDPM Container Shutdown.
Nov 29 01:27:28 np0005539505 python3.9[67317]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:28 np0005539505 python3.9[67440]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397647.6349263-447-108758104431382/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:29 np0005539505 python3.9[67592]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:29 np0005539505 python3.9[67715]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397648.9764574-493-140997582246142/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:30 np0005539505 python3.9[67867]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:30 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:30 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:30 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:31 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:31 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:31 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:31 np0005539505 systemd[1]: Starting Create netns directory...
Nov 29 01:27:31 np0005539505 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:27:31 np0005539505 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:27:31 np0005539505 systemd[1]: Finished Create netns directory.
Nov 29 01:27:32 np0005539505 python3.9[68093]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:27:32 np0005539505 network[68110]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:27:32 np0005539505 network[68111]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:27:32 np0005539505 network[68112]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:27:36 np0005539505 python3.9[68374]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:36 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:36 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:36 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:36 np0005539505 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 29 01:27:36 np0005539505 iptables.init[68415]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 29 01:27:36 np0005539505 iptables.init[68415]: iptables: Flushing firewall rules: [  OK  ]
Nov 29 01:27:36 np0005539505 systemd[1]: iptables.service: Deactivated successfully.
Nov 29 01:27:36 np0005539505 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 29 01:27:37 np0005539505 python3.9[68611]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:38 np0005539505 python3.9[68765]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:27:38 np0005539505 systemd[1]: Reloading.
Nov 29 01:27:38 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:27:38 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:27:38 np0005539505 systemd[1]: Starting Netfilter Tables...
Nov 29 01:27:38 np0005539505 systemd[1]: Finished Netfilter Tables.
Nov 29 01:27:39 np0005539505 python3.9[68958]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:40 np0005539505 python3.9[69111]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:41 np0005539505 python3.9[69236]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397660.431973-700-143190190557231/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:42 np0005539505 python3.9[69389]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:27:42 np0005539505 systemd[1]: Reloading OpenSSH server daemon...
Nov 29 01:27:42 np0005539505 systemd[1]: Reloaded OpenSSH server daemon.
Nov 29 01:27:43 np0005539505 python3.9[69545]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:44 np0005539505 python3.9[69697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:44 np0005539505 python3.9[69820]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397663.5297558-793-274416229325489/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:45 np0005539505 python3.9[69972]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 29 01:27:45 np0005539505 systemd[1]: Starting Time & Date Service...
Nov 29 01:27:45 np0005539505 systemd[1]: Started Time & Date Service.
Nov 29 01:27:47 np0005539505 python3.9[70128]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:47 np0005539505 python3.9[70280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:48 np0005539505 python3.9[70403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397667.295516-898-71958205433481/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:49 np0005539505 python3.9[70555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:49 np0005539505 python3.9[70678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397668.7288961-943-52965142064872/.source.yaml _original_basename=.i1t3ykpg follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:50 np0005539505 python3.9[70830]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:51 np0005539505 python3.9[70953]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397670.2375603-988-229221530905062/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:52 np0005539505 python3.9[71105]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:52 np0005539505 python3.9[71258]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:27:53 np0005539505 python3[71411]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:27:54 np0005539505 python3.9[71563]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:55 np0005539505 python3.9[71686]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397674.0024314-1105-43174478786171/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:55 np0005539505 python3.9[71838]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:56 np0005539505 python3.9[71961]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397675.301986-1150-80851492112788/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:57 np0005539505 python3.9[72113]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:57 np0005539505 python3.9[72236]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397676.641749-1195-234411584479861/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:27:58 np0005539505 python3.9[72388]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:27:59 np0005539505 python3.9[72511]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397678.06647-1240-9282071802390/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:00 np0005539505 python3.9[72663]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:28:00 np0005539505 python3.9[72786]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397679.4521623-1284-117714567476169/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:01 np0005539505 python3.9[72938]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:02 np0005539505 python3.9[73090]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:03 np0005539505 python3.9[73249]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:03 np0005539505 python3.9[73402]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:04 np0005539505 python3.9[73554]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:06 np0005539505 python3.9[73706]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:28:06 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:28:07 np0005539505 python3.9[73860]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 29 01:28:07 np0005539505 systemd[1]: session-15.scope: Deactivated successfully.
Nov 29 01:28:07 np0005539505 systemd[1]: session-15.scope: Consumed 33.519s CPU time.
Nov 29 01:28:07 np0005539505 systemd-logind[794]: Session 15 logged out. Waiting for processes to exit.
Nov 29 01:28:07 np0005539505 systemd-logind[794]: Removed session 15.
Nov 29 01:28:13 np0005539505 systemd-logind[794]: New session 16 of user zuul.
Nov 29 01:28:13 np0005539505 systemd[1]: Started Session 16 of User zuul.
Nov 29 01:28:14 np0005539505 python3.9[74041]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 29 01:28:15 np0005539505 python3.9[74193]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:15 np0005539505 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 29 01:28:16 np0005539505 python3.9[74347]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:17 np0005539505 python3.9[74499]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDxE9+in1YAsVzo2PkbOP/y9jW13mE04+F1VrPVmmgKUME6PWRBUtuT66AB40zRYi5yO+6N76+VAJcvtF1kNGhm3shwR+EkOKx8SHbU+RviKmRHfsi7XEfyHL7uPXOJMckqz85eUFqMQlXm0T6k8SbAwg/7v0r7w70oz6RysylzQYZWVeFgXZ7UFNiz+TKXL4x8MRY/6V3JMXIBdt/vb6cGmIyDwfTLPa/VxO6oKiuknrmAhd6pKWVOAoLeLvCJFRcnCjfZygatiRnwzibR7Xmo/fWClfIWB/RpJC5vSGru0Y/btrmoNInBd93XAWFRh8/+L/mTAUqvgP7Dy/Ft6JXARlkcmX64/tqwMI7M6a4A9voOZ8Eb1cJyJ/XgWoTXUZB9+cehvGP5J0tLJkw/iGBXKOcXhP99ulw5rvtkAaOXV6omaio88Pl85lT2ISJO6g47/pk27eMMKNXxMdNlhqVOtR5zLQHv3t0Pvd9/HFZhfcx1w86u5aR+V9irnyt3WAc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB2l802ocmKW/xzYye+Pzw89MQvA5jQh5a0yLK2ZyZCd#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI7XJn2j/ECeKq3mKYHO54Bh/Op2+6G6UX6ad7xn+hglSDuDDZy9KOJY974X6YapBGPsvID5GfLpKZuusj2w6cw=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDSD8ZpMWbyfWwat32zE3dwK32EyLj7Y+//yic/Bd8bh7jSKBLK9ym42oYT01mO+NFTdefo2ARchFmERRxMzut4oUrqMlfhrn+mNHsvLaQycoAg+oq19ivJki9YXqDUIR0GwObpBRBSVczn15OcfSZNvJ+5yEWYWoeMXyjR7IpLeP4unrXYU5Gefx+ixYfHqq9U2klSr1mLGklHOYT1257UfS7aFtDHfrGqNLBghhbpbjLBljCPzwbz2JHg+8oO3x0s19DpnMBT0ID3emGqK2CRupsBeiWpZYUfcIDbCqmgcmC5QRkORpTRfGSYdDcsqSjpDOkPShwf1Le1r5QnW7JiFsy0ogLQ0ThcibSAVqVQZpFDROMSTPeqUlnDDqklZEtTgARcUGiVhmiXhR8sIdJXzJ5b1IB28Y3jGlf6kmQpBa9raXRegF/7J3SWDcOHO/sYe7Wh50S0cBgRgix0492hkGz3icxCzNwpQ5H/dTKdLCX7SvWyn/dHYE7411EP0Xc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIARLjHbwtuz0VGhEJnZ8jUcmug4YEziBMgu/+Q2Xf/qr#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN/4QosKjhedc/jgjDOXpXhsciLiDd+ILxSMZxLO5NzR72Gm5KH5lEdveLrailDwVrIBl1+UjfksCNfnn+zVt1w=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCs1lxOj+O3cXQh+L6Hvro0WUX7vGdONQb0UkjJDqrzMWuP0tmX4CuMYeN2kUtGqc5U1dKriurXmo1qGVTvVz1rFJWYr1e1qwcv/DCLijB+4QR8oi61K8+nnWm47XeUoyWOI1GxkiHPeLPUs3QDDbHClDRGD9SWUQ5AtaO0NqAPalgp4eYChWy0Y4soQNnOXqbjnwEsJRK85/mXhogmZpALrFBu87oJtbviSxczqa+4bci7R6jWZ+ZkZbKw2+D3QskWWoHcgFgQVCprAXuj/ebUq1gyCY/d+tnyQs80H9XZ6Ryvmu1e7zEhKJvldu5mAamd8l4EwL79yt1ds7cSRXEH/+ajyYpXXTerzMFIsItjkdt+fg8DiheTqZexiHXvykMSjhPdshC1A9JWSsD+ISIR5qLPmHx5g3kZyVt5WM3mPfqh8WYsG4FM7EzMz492DnLUqdIsJXOBPjExJZhCLYvOdjJI5hMYHQ2GTE4ZlW0rvYr85xi12yOn9K3zmZ6q2SU=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOVwi6LOnwRGXKTYlL5FohHpKT05ra2BKYgm2kBQxP+u#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOCcnxH2XsLxMcRRLaA4DruLY3oryYRdOPfwLiZD7s7kBHBXt+svOGk0QImtaVEKV/k9369qMK8GrFyzO2efaCk=#012 create=True mode=0644 path=/tmp/ansible.cm45o0el state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:18 np0005539505 python3.9[74651]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.cm45o0el' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:19 np0005539505 python3.9[74805]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.cm45o0el state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:20 np0005539505 systemd[1]: session-16.scope: Deactivated successfully.
Nov 29 01:28:20 np0005539505 systemd[1]: session-16.scope: Consumed 3.421s CPU time.
Nov 29 01:28:20 np0005539505 systemd-logind[794]: Session 16 logged out. Waiting for processes to exit.
Nov 29 01:28:20 np0005539505 systemd-logind[794]: Removed session 16.
Nov 29 01:28:26 np0005539505 systemd-logind[794]: New session 17 of user zuul.
Nov 29 01:28:26 np0005539505 systemd[1]: Started Session 17 of User zuul.
Nov 29 01:28:27 np0005539505 python3.9[74983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:28 np0005539505 python3.9[75139]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:28:29 np0005539505 python3.9[75294]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:28:31 np0005539505 python3.9[75447]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:32 np0005539505 python3.9[75600]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:32 np0005539505 python3.9[75754]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:33 np0005539505 python3.9[75909]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:28:34 np0005539505 systemd[1]: session-17.scope: Deactivated successfully.
Nov 29 01:28:34 np0005539505 systemd[1]: session-17.scope: Consumed 4.290s CPU time.
Nov 29 01:28:34 np0005539505 systemd-logind[794]: Session 17 logged out. Waiting for processes to exit.
Nov 29 01:28:34 np0005539505 systemd-logind[794]: Removed session 17.
Nov 29 01:28:39 np0005539505 systemd-logind[794]: New session 18 of user zuul.
Nov 29 01:28:39 np0005539505 systemd[1]: Started Session 18 of User zuul.
Nov 29 01:28:40 np0005539505 python3.9[76087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:42 np0005539505 python3.9[76243]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:28:43 np0005539505 python3.9[76327]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 29 01:28:45 np0005539505 python3.9[76478]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:28:47 np0005539505 python3.9[76629]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:28:47 np0005539505 python3.9[76779]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:48 np0005539505 python3.9[76929]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:28:49 np0005539505 systemd[1]: session-18.scope: Deactivated successfully.
Nov 29 01:28:49 np0005539505 systemd[1]: session-18.scope: Consumed 5.818s CPU time.
Nov 29 01:28:49 np0005539505 systemd-logind[794]: Session 18 logged out. Waiting for processes to exit.
Nov 29 01:28:49 np0005539505 systemd-logind[794]: Removed session 18.
Nov 29 01:28:55 np0005539505 systemd-logind[794]: New session 19 of user zuul.
Nov 29 01:28:55 np0005539505 systemd[1]: Started Session 19 of User zuul.
Nov 29 01:28:56 np0005539505 python3.9[77107]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:28:59 np0005539505 python3.9[77263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:28:59 np0005539505 python3.9[77415]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:00 np0005539505 python3.9[77567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:01 np0005539505 python3.9[77690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397739.9825323-157-81490168839089/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=fd8e5d3798df682c54a12f09e0c326909e3124b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:01 np0005539505 python3.9[77842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:02 np0005539505 python3.9[77965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397741.5153992-157-41800220410982/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=53ec2692be9fe7fa10ffde7cdba9150c4076f3fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:03 np0005539505 python3.9[78117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:04 np0005539505 python3.9[78240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397742.7216604-157-260228779198488/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=07a7156bfb8a601874ca86af5508fc2ac992231c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:04 np0005539505 python3.9[78392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:05 np0005539505 python3.9[78544]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:06 np0005539505 python3.9[78696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:06 np0005539505 python3.9[78819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397745.5621243-335-208428390415046/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=b8a2bbc7a2bafa3c050f9b6e72828e10e35091ca backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:07 np0005539505 python3.9[78971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:07 np0005539505 python3.9[79094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397746.8004215-335-91981317750624/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=bf4291d96f7c0f5cd858ccf4f424f476f6c02cd9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:08 np0005539505 python3.9[79246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:09 np0005539505 python3.9[79369]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397747.8754723-335-100056911859606/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=f4ac32ea8d93ec45a7e4f946d7e9e8fe3047c2b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:09 np0005539505 python3.9[79521]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:10 np0005539505 python3.9[79673]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:10 np0005539505 python3.9[79825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:11 np0005539505 python3.9[79948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397750.4444244-504-124503936397134/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=6f36e378fefc01bab92460f7d764c54f4de93edc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:12 np0005539505 python3.9[80100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:12 np0005539505 python3.9[80223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397751.7129774-504-102320629022945/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:13 np0005539505 python3.9[80375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:13 np0005539505 python3.9[80498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397752.852834-504-113329447605684/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=77f3b24af0b44ec965bb00221e86a3aa9313a8aa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:14 np0005539505 chronyd[64671]: Selected source 216.232.132.102 (pool.ntp.org)
Nov 29 01:29:14 np0005539505 python3.9[80650]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:15 np0005539505 python3.9[80802]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:15 np0005539505 python3.9[80954]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:16 np0005539505 python3.9[81077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397755.3336082-653-33573553309804/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=f60a3a3fc80b5253e231b85070073044643e2c88 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:17 np0005539505 python3.9[81229]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:17 np0005539505 python3.9[81352]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397756.4850907-653-170364312171193/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=d5ce9fc6df7543706791229321e0116a703016b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:18 np0005539505 python3.9[81504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:19 np0005539505 python3.9[81627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397757.7958684-653-179153992199132/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b543b19168d8bb96e7335b6cf354506e950ba992 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:20 np0005539505 python3.9[81779]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:21 np0005539505 python3.9[81931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:21 np0005539505 python3.9[82054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397760.5954359-857-43213221539775/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:23 np0005539505 python3.9[82206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:24 np0005539505 python3.9[82358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:25 np0005539505 python3.9[82481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397763.7174478-960-18239090053592/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:25 np0005539505 python3.9[82633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:26 np0005539505 python3.9[82785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:26 np0005539505 python3.9[82908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397765.9521453-1044-150701523531840/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:27 np0005539505 python3.9[83060]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:28 np0005539505 python3.9[83212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:28 np0005539505 python3.9[83335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397767.8860033-1109-8829938805046/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:29 np0005539505 python3.9[83487]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:30 np0005539505 python3.9[83639]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:30 np0005539505 python3.9[83762]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397769.7551296-1173-147421678328577/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:31 np0005539505 python3.9[83914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:32 np0005539505 python3.9[84066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:32 np0005539505 python3.9[84189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397771.5684352-1244-90236470684055/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:33 np0005539505 python3.9[84341]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:33 np0005539505 python3.9[84493]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:29:34 np0005539505 python3.9[84616]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397773.5184915-1303-256552272880522/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e40b48f8fd7ce69cedaa6e53dbd579733b9096d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:29:38 np0005539505 systemd[1]: session-19.scope: Deactivated successfully.
Nov 29 01:29:38 np0005539505 systemd[1]: session-19.scope: Consumed 26.689s CPU time.
Nov 29 01:29:38 np0005539505 systemd-logind[794]: Session 19 logged out. Waiting for processes to exit.
Nov 29 01:29:38 np0005539505 systemd-logind[794]: Removed session 19.
Nov 29 01:29:43 np0005539505 systemd-logind[794]: New session 20 of user zuul.
Nov 29 01:29:43 np0005539505 systemd[1]: Started Session 20 of User zuul.
Nov 29 01:29:44 np0005539505 python3.9[84795]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:45 np0005539505 python3.9[84951]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:46 np0005539505 python3.9[85103]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:29:47 np0005539505 python3.9[85253]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:29:48 np0005539505 python3.9[85405]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:29:53 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 29 01:29:54 np0005539505 python3.9[85561]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:29:55 np0005539505 python3.9[85645]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:29:57 np0005539505 python3.9[85798]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:30:01 np0005539505 python3[85953]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 29 01:30:02 np0005539505 python3.9[86105]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:03 np0005539505 python3.9[86257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:03 np0005539505 python3.9[86335]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:04 np0005539505 python3.9[86487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:05 np0005539505 python3.9[86565]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.ldqoc9yv recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:05 np0005539505 python3.9[86717]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:06 np0005539505 python3.9[86795]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:07 np0005539505 python3.9[86947]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:08 np0005539505 python3[87100]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:30:09 np0005539505 python3.9[87252]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:09 np0005539505 python3.9[87377]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397808.5520155-437-239922939384989/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:10 np0005539505 python3.9[87529]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:13 np0005539505 python3.9[87654]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397810.1781273-482-93245896408786/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:14 np0005539505 python3.9[87806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:15 np0005539505 python3.9[87931]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397813.690883-528-230621200014198/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:16 np0005539505 python3.9[88083]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:16 np0005539505 python3.9[88208]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397815.3082879-573-279507855939306/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:17 np0005539505 python3.9[88360]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:18 np0005539505 python3.9[88485]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764397817.1499205-617-243394782819129/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:19 np0005539505 python3.9[88637]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:20 np0005539505 python3.9[88789]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:21 np0005539505 python3.9[88944]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:22 np0005539505 python3.9[89096]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:23 np0005539505 python3.9[89249]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:24 np0005539505 python3.9[89403]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:24 np0005539505 python3.9[89558]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:26 np0005539505 python3.9[89708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:30:27 np0005539505 python3.9[89861]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:27 np0005539505 ovs-vsctl[89862]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 29 01:30:28 np0005539505 python3.9[90014]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:29 np0005539505 python3.9[90169]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:30:29 np0005539505 ovs-vsctl[90170]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 29 01:30:30 np0005539505 python3.9[90320]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:30 np0005539505 python3.9[90474]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:31 np0005539505 python3.9[90626]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:31 np0005539505 python3.9[90704]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:32 np0005539505 python3.9[90856]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:33 np0005539505 python3.9[90934]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:33 np0005539505 python3.9[91086]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:34 np0005539505 python3.9[91238]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:35 np0005539505 python3.9[91316]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:35 np0005539505 python3.9[91468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:36 np0005539505 python3.9[91546]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:37 np0005539505 python3.9[91698]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:37 np0005539505 systemd[1]: Reloading.
Nov 29 01:30:38 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:38 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:39 np0005539505 python3.9[91887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:39 np0005539505 python3.9[91965]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:40 np0005539505 python3.9[92117]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:40 np0005539505 python3.9[92195]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:41 np0005539505 python3.9[92347]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:41 np0005539505 systemd[1]: Reloading.
Nov 29 01:30:41 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:41 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:41 np0005539505 systemd[1]: Starting Create netns directory...
Nov 29 01:30:41 np0005539505 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:30:41 np0005539505 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:30:41 np0005539505 systemd[1]: Finished Create netns directory.
Nov 29 01:30:42 np0005539505 python3.9[92542]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:43 np0005539505 python3.9[92694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:44 np0005539505 python3.9[92817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397843.1078222-1371-170017194578577/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:45 np0005539505 python3.9[92969]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:30:45 np0005539505 python3.9[93121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:30:46 np0005539505 python3.9[93244]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397845.469103-1445-155231236355097/.source.json _original_basename=.tax_n8fs follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:47 np0005539505 python3.9[93396]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:50 np0005539505 python3.9[93823]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 29 01:30:51 np0005539505 python3.9[93976]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:30:52 np0005539505 python3.9[94128]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:30:52 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539505 python3[94291]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:30:54 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:54 np0005539505 podman[94330]: 2025-11-29 06:30:54.553635897 +0000 UTC m=+0.021566455 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:54 np0005539505 podman[94330]: 2025-11-29 06:30:54.805814608 +0000 UTC m=+0.273745166 container create f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:30:54 np0005539505 python3[94291]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume 
/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 29 01:30:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 29 01:30:56 np0005539505 python3.9[94520]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:57 np0005539505 python3.9[94674]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:57 np0005539505 python3.9[94750]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:30:58 np0005539505 python3.9[94901]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397857.9179168-1709-110381967658279/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:30:59 np0005539505 python3.9[94977]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:30:59 np0005539505 systemd[1]: Reloading.
Nov 29 01:30:59 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:59 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:30:59 np0005539505 python3.9[95089]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:30:59 np0005539505 systemd[1]: Reloading.
Nov 29 01:30:59 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:30:59 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:00 np0005539505 systemd[1]: Starting ovn_controller container...
Nov 29 01:31:00 np0005539505 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 29 01:31:00 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:31:00 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3edec70490225d28c22337ee1d1a85b777c24a7589ea287cf270bf7a959fe695/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:31:00 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43.
Nov 29 01:31:00 np0005539505 podman[95130]: 2025-11-29 06:31:00.320937109 +0000 UTC m=+0.171950846 container init f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + sudo -E kolla_set_configs
Nov 29 01:31:00 np0005539505 podman[95130]: 2025-11-29 06:31:00.356297625 +0000 UTC m=+0.207311312 container start f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:31:00 np0005539505 systemd[1]: Created slice User Slice of UID 0.
Nov 29 01:31:00 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 29 01:31:00 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 29 01:31:00 np0005539505 systemd[1]: Starting User Manager for UID 0...
Nov 29 01:31:00 np0005539505 edpm-start-podman-container[95130]: ovn_controller
Nov 29 01:31:00 np0005539505 edpm-start-podman-container[95129]: Creating additional drop-in dependency for "ovn_controller" (f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43)
Nov 29 01:31:00 np0005539505 systemd[95166]: Queued start job for default target Main User Target.
Nov 29 01:31:00 np0005539505 systemd[1]: Reloading.
Nov 29 01:31:00 np0005539505 podman[95151]: 2025-11-29 06:31:00.557577275 +0000 UTC m=+0.190688045 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:31:00 np0005539505 systemd[95166]: Created slice User Application Slice.
Nov 29 01:31:00 np0005539505 systemd[95166]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 29 01:31:00 np0005539505 systemd[95166]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:31:00 np0005539505 systemd[95166]: Reached target Paths.
Nov 29 01:31:00 np0005539505 systemd[95166]: Reached target Timers.
Nov 29 01:31:00 np0005539505 systemd[95166]: Starting D-Bus User Message Bus Socket...
Nov 29 01:31:00 np0005539505 systemd[95166]: Starting Create User's Volatile Files and Directories...
Nov 29 01:31:00 np0005539505 systemd[95166]: Finished Create User's Volatile Files and Directories.
Nov 29 01:31:00 np0005539505 systemd[95166]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:31:00 np0005539505 systemd[95166]: Reached target Sockets.
Nov 29 01:31:00 np0005539505 systemd[95166]: Reached target Basic System.
Nov 29 01:31:00 np0005539505 systemd[95166]: Reached target Main User Target.
Nov 29 01:31:00 np0005539505 systemd[95166]: Startup finished in 122ms.
Nov 29 01:31:00 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:00 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:00 np0005539505 systemd[1]: Started User Manager for UID 0.
Nov 29 01:31:00 np0005539505 systemd[1]: Started ovn_controller container.
Nov 29 01:31:00 np0005539505 systemd[1]: f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43-5d3226aea51d212f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:31:00 np0005539505 systemd[1]: f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43-5d3226aea51d212f.service: Failed with result 'exit-code'.
Nov 29 01:31:00 np0005539505 systemd[1]: Started Session c1 of User root.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: INFO:__main__:Validating config file
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: INFO:__main__:Writing out command to execute
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: ++ cat /run_command
Nov 29 01:31:00 np0005539505 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + ARGS=
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + sudo kolla_copy_cacerts
Nov 29 01:31:00 np0005539505 systemd[1]: Started Session c2 of User root.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + [[ ! -n '' ]]
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + . kolla_extend_start
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + umask 0022
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 29 01:31:00 np0005539505 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 29 01:31:00 np0005539505 NetworkManager[55134]: <info>  [1764397860.9585] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 29 01:31:00 np0005539505 NetworkManager[55134]: <info>  [1764397860.9597] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:31:00 np0005539505 NetworkManager[55134]: <info>  [1764397860.9609] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 29 01:31:00 np0005539505 NetworkManager[55134]: <info>  [1764397860.9615] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 29 01:31:00 np0005539505 NetworkManager[55134]: <info>  [1764397860.9617] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 29 01:31:00 np0005539505 kernel: br-int: entered promiscuous mode
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00018|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00019|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00021|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00022|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00023|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 29 01:31:00 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:00Z|00024|main|INFO|OVS feature set changed, force recompute.
Nov 29 01:31:01 np0005539505 systemd-udevd[95277]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:01Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 29 01:31:01 np0005539505 NetworkManager[55134]: <info>  [1764397861.0860] manager: (ovn-bd30a8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 29 01:31:01 np0005539505 NetworkManager[55134]: <info>  [1764397861.0870] manager: (ovn-7525db-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Nov 29 01:31:01 np0005539505 kernel: genev_sys_6081: entered promiscuous mode
Nov 29 01:31:01 np0005539505 NetworkManager[55134]: <info>  [1764397861.1040] device (genev_sys_6081): carrier: link connected
Nov 29 01:31:01 np0005539505 NetworkManager[55134]: <info>  [1764397861.1043] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Nov 29 01:31:01 np0005539505 NetworkManager[55134]: <info>  [1764397861.6085] manager: (ovn-a43628-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 29 01:31:01 np0005539505 python3.9[95409]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:01 np0005539505 ovs-vsctl[95410]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 29 01:31:02 np0005539505 python3.9[95562]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:02 np0005539505 ovs-vsctl[95564]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 29 01:31:03 np0005539505 python3.9[95717]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:31:03 np0005539505 ovs-vsctl[95718]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 29 01:31:04 np0005539505 systemd[1]: session-20.scope: Deactivated successfully.
Nov 29 01:31:04 np0005539505 systemd[1]: session-20.scope: Consumed 43.426s CPU time.
Nov 29 01:31:04 np0005539505 systemd-logind[794]: Session 20 logged out. Waiting for processes to exit.
Nov 29 01:31:04 np0005539505 systemd-logind[794]: Removed session 20.
Nov 29 01:31:10 np0005539505 systemd-logind[794]: New session 22 of user zuul.
Nov 29 01:31:10 np0005539505 systemd[1]: Started Session 22 of User zuul.
Nov 29 01:31:11 np0005539505 systemd[1]: Stopping User Manager for UID 0...
Nov 29 01:31:11 np0005539505 systemd[95166]: Activating special unit Exit the Session...
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped target Main User Target.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped target Basic System.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped target Paths.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped target Sockets.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped target Timers.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:31:11 np0005539505 systemd[95166]: Closed D-Bus User Message Bus Socket.
Nov 29 01:31:11 np0005539505 systemd[95166]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:31:11 np0005539505 systemd[95166]: Removed slice User Application Slice.
Nov 29 01:31:11 np0005539505 systemd[95166]: Reached target Shutdown.
Nov 29 01:31:11 np0005539505 systemd[95166]: Finished Exit the Session.
Nov 29 01:31:11 np0005539505 systemd[95166]: Reached target Exit the Session.
Nov 29 01:31:11 np0005539505 systemd[1]: user@0.service: Deactivated successfully.
Nov 29 01:31:11 np0005539505 systemd[1]: Stopped User Manager for UID 0.
Nov 29 01:31:11 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 29 01:31:11 np0005539505 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 29 01:31:11 np0005539505 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 29 01:31:11 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 29 01:31:11 np0005539505 systemd[1]: Removed slice User Slice of UID 0.
Nov 29 01:31:11 np0005539505 python3.9[95899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:31:13 np0005539505 python3.9[96055]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:13 np0005539505 python3.9[96207]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:14 np0005539505 python3.9[96359]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:20 np0005539505 python3.9[96511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:20 np0005539505 python3.9[96664]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:21 np0005539505 python3.9[96814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:31:22 np0005539505 python3.9[96966]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 29 01:31:24 np0005539505 python3.9[97118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:24 np0005539505 python3.9[97239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397883.6588101-225-67628016932594/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:25 np0005539505 python3.9[97389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:26 np0005539505 python3.9[97510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397885.2213728-270-233564131942973/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:27 np0005539505 python3.9[97662]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:31:28 np0005539505 python3.9[97746]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:31:31 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:31Z|00025|memory|INFO|16000 kB peak resident set size after 30.2 seconds
Nov 29 01:31:31 np0005539505 ovn_controller[95143]: 2025-11-29T06:31:31Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 29 01:31:31 np0005539505 podman[97871]: 2025-11-29 06:31:31.25893687 +0000 UTC m=+0.169554476 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:31:31 np0005539505 python3.9[97915]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:31:32 np0005539505 python3.9[98076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:32 np0005539505 python3.9[98197]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397891.8597453-381-112121097842488/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:33 np0005539505 python3.9[98347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:33 np0005539505 python3.9[98468]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397892.9553568-381-192274480809330/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:35 np0005539505 python3.9[98618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:35 np0005539505 python3.9[98739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397894.8204913-513-271463747066327/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:37 np0005539505 python3.9[98889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:37 np0005539505 python3.9[99010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397895.8837652-513-175444667716876/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:38 np0005539505 python3.9[99160]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:31:39 np0005539505 python3.9[99314]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:39 np0005539505 python3.9[99466]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:40 np0005539505 python3.9[99544]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:40 np0005539505 python3.9[99696]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:41 np0005539505 python3.9[99774]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:42 np0005539505 python3.9[99926]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:42 np0005539505 python3.9[100078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:43 np0005539505 python3.9[100156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:44 np0005539505 python3.9[100308]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:44 np0005539505 python3.9[100386]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:45 np0005539505 python3.9[100538]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:45 np0005539505 systemd[1]: Reloading.
Nov 29 01:31:45 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:45 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:47 np0005539505 python3.9[100726]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:47 np0005539505 python3.9[100804]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:48 np0005539505 python3.9[100956]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:48 np0005539505 python3.9[101034]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:49 np0005539505 python3.9[101186]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:31:49 np0005539505 systemd[1]: Reloading.
Nov 29 01:31:49 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:31:49 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:31:50 np0005539505 systemd[1]: Starting Create netns directory...
Nov 29 01:31:50 np0005539505 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:31:50 np0005539505 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:31:50 np0005539505 systemd[1]: Finished Create netns directory.
Nov 29 01:31:50 np0005539505 python3.9[101379]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:51 np0005539505 python3.9[101531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:52 np0005539505 python3.9[101654]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764397911.1911678-967-188815516920292/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:53 np0005539505 python3.9[101806]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:31:53 np0005539505 python3.9[101958]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:31:54 np0005539505 python3.9[102081]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764397913.4405677-1041-87085316058661/.source.json _original_basename=.l5uev_u0 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:55 np0005539505 python3.9[102233]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:31:57 np0005539505 python3.9[102660]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 29 01:31:58 np0005539505 python3.9[102812]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:31:59 np0005539505 python3.9[102964]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:32:01 np0005539505 python3[103141]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:32:01 np0005539505 podman[103167]: 2025-11-29 06:32:01.912679781 +0000 UTC m=+0.226818090 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:32:15 np0005539505 podman[103153]: 2025-11-29 06:32:15.975475352 +0000 UTC m=+14.604561210 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:16 np0005539505 podman[103275]: 2025-11-29 06:32:16.1281237 +0000 UTC m=+0.055357813 container create ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:32:16 np0005539505 podman[103275]: 2025-11-29 06:32:16.097745163 +0000 UTC m=+0.024979316 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:16 np0005539505 python3[103141]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:32:16 np0005539505 python3.9[103463]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:17 np0005539505 python3.9[103617]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:18 np0005539505 python3.9[103693]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:32:18 np0005539505 python3.9[103844]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764397938.3142967-1305-276840730172232/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:20 np0005539505 python3.9[103920]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:20 np0005539505 systemd[1]: Reloading.
Nov 29 01:32:20 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:20 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:20 np0005539505 python3.9[104031]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:21 np0005539505 systemd[1]: Reloading.
Nov 29 01:32:21 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:21 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:21 np0005539505 systemd[1]: Starting ovn_metadata_agent container...
Nov 29 01:32:21 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:32:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fdd02fdc34e8b9da89f6941991cf52df18f7d04ba7fa7d333cfa4c7c62a880/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/56fdd02fdc34e8b9da89f6941991cf52df18f7d04ba7fa7d333cfa4c7c62a880/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:32:24 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282.
Nov 29 01:32:24 np0005539505 podman[104073]: 2025-11-29 06:32:24.964062124 +0000 UTC m=+3.562243125 container init ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:32:24 np0005539505 ovn_metadata_agent[104089]: + sudo -E kolla_set_configs
Nov 29 01:32:24 np0005539505 podman[104073]: 2025-11-29 06:32:24.996691075 +0000 UTC m=+3.594872056 container start ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Validating config file
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Copying service configuration files
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Writing out command to execute
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: ++ cat /run_command
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + CMD=neutron-ovn-metadata-agent
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + ARGS=
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + sudo kolla_copy_cacerts
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + [[ ! -n '' ]]
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + . kolla_extend_start
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: Running command: 'neutron-ovn-metadata-agent'
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + umask 0022
Nov 29 01:32:25 np0005539505 ovn_metadata_agent[104089]: + exec neutron-ovn-metadata-agent
Nov 29 01:32:25 np0005539505 edpm-start-podman-container[104073]: ovn_metadata_agent
Nov 29 01:32:25 np0005539505 edpm-start-podman-container[104072]: Creating additional drop-in dependency for "ovn_metadata_agent" (ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282)
Nov 29 01:32:25 np0005539505 podman[104096]: 2025-11-29 06:32:25.501143693 +0000 UTC m=+0.493781068 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:32:25 np0005539505 systemd[1]: Reloading.
Nov 29 01:32:25 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:25 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:25 np0005539505 systemd[1]: Started ovn_metadata_agent container.
Nov 29 01:32:26 np0005539505 systemd-logind[794]: Session 22 logged out. Waiting for processes to exit.
Nov 29 01:32:26 np0005539505 systemd[1]: session-22.scope: Deactivated successfully.
Nov 29 01:32:26 np0005539505 systemd[1]: session-22.scope: Consumed 47.579s CPU time.
Nov 29 01:32:26 np0005539505 systemd-logind[794]: Removed session 22.
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.862 104094 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.863 104094 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.863 104094 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.863 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.863 104094 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.864 104094 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.865 104094 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.866 104094 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.867 104094 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.868 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.869 104094 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.870 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.871 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.872 104094 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.873 104094 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.874 104094 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.875 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.876 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.877 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.878 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.879 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.880 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.881 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.882 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.883 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.884 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.885 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.886 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.887 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.888 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.889 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.890 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.891 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.892 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.893 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.894 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.895 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.896 104094 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.905 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.905 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.905 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.906 104094 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.906 104094 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.919 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0 (UUID: cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.941 104094 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.941 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.942 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.942 104094 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.945 104094 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.951 104094 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.957 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], external_ids={}, name=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, nb_cfg_timestamp=1764397868982, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.958 104094 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6389ad6a90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.959 104094 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.964 104094 DEBUG oslo_service.service [-] Started child 104201 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.968 104094 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp49vhtj5z/privsep.sock']#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.968 104201 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-496257'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.989 104201 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.990 104201 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.990 104201 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.993 104201 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 29 01:32:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:26.998 104201 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.003 104201 INFO eventlet.wsgi.server [-] (104201) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 29 01:32:27 np0005539505 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.682 104094 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.683 104094 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp49vhtj5z/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.570 104206 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.574 104206 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.576 104206 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.576 104206 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104206#033[00m
Nov 29 01:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:27.685 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[a14f3275-620e-49b3-a7d4-9c8821b2be36]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.158 104206 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.158 104206 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.159 104206 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.662 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[efc6519b-6ffe-479e-84a4-9fb97bc8e4b4]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.666 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, column=external_ids, values=({'neutron:ovn-metadata-id': '95929da3-015a-5906-bb8c-1382cd5d1283'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.679 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.686 104094 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.686 104094 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.687 104094 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.688 104094 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.689 104094 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.690 104094 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.691 104094 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.692 104094 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.693 104094 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.694 104094 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.695 104094 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.696 104094 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.697 104094 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.698 104094 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.699 104094 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.700 104094 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.701 104094 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.702 104094 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.703 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.704 104094 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.705 104094 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.706 104094 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.707 104094 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.708 104094 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.709 104094 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.710 104094 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.711 104094 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.712 104094 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.713 104094 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.714 104094 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.715 104094 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.716 104094 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.717 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.718 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.719 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.720 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:32:28.721 104094 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:32:32 np0005539505 systemd-logind[794]: New session 23 of user zuul.
Nov 29 01:32:32 np0005539505 systemd[1]: Started Session 23 of User zuul.
Nov 29 01:32:32 np0005539505 podman[104213]: 2025-11-29 06:32:32.519007788 +0000 UTC m=+0.104901119 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:32:33 np0005539505 python3.9[104390]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:32:34 np0005539505 python3.9[104546]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:32:37 np0005539505 python3.9[104711]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:32:37 np0005539505 systemd[1]: Reloading.
Nov 29 01:32:37 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:32:37 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:32:41 np0005539505 python3.9[104897]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:32:41 np0005539505 network[104914]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:32:41 np0005539505 network[104915]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:32:41 np0005539505 network[104916]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:32:46 np0005539505 python3.9[105178]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:49 np0005539505 python3.9[105332]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:51 np0005539505 python3.9[105485]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:52 np0005539505 python3.9[105638]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:53 np0005539505 python3.9[105791]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:54 np0005539505 python3.9[105945]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:55 np0005539505 python3.9[106098]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:32:55 np0005539505 podman[106169]: 2025-11-29 06:32:55.726983889 +0000 UTC m=+0.060461649 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:32:56 np0005539505 python3.9[106271]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:57 np0005539505 python3.9[106423]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:57 np0005539505 python3.9[106575]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:58 np0005539505 python3.9[106727]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:59 np0005539505 python3.9[106879]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:32:59 np0005539505 python3.9[107031]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:00 np0005539505 python3.9[107183]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:02 np0005539505 python3.9[107337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:02 np0005539505 podman[107387]: 2025-11-29 06:33:02.745400086 +0000 UTC m=+0.077207357 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 01:33:03 np0005539505 python3.9[107514]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:03 np0005539505 python3.9[107666]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:04 np0005539505 python3.9[107818]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:04 np0005539505 python3.9[107970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:05 np0005539505 python3.9[108122]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:06 np0005539505 python3.9[108274]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:33:07 np0005539505 python3.9[108426]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:08 np0005539505 python3.9[108578]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:33:09 np0005539505 python3.9[108730]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:33:09 np0005539505 systemd[1]: Reloading.
Nov 29 01:33:09 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:33:09 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:33:10 np0005539505 python3.9[108917]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:11 np0005539505 python3.9[109070]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:11 np0005539505 python3.9[109223]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:12 np0005539505 python3.9[109376]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:13 np0005539505 python3.9[109529]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:14 np0005539505 python3.9[109682]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:14 np0005539505 python3.9[109835]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:33:16 np0005539505 python3.9[109988]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 29 01:33:17 np0005539505 python3.9[110141]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:33:19 np0005539505 python3.9[110299]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:33:20 np0005539505 python3.9[110459]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:33:21 np0005539505 python3.9[110543]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:33:26 np0005539505 podman[110555]: 2025-11-29 06:33:26.725985828 +0000 UTC m=+0.060378127 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:33:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:33:26.899 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:33:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:33:26.900 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:33:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:33:26.900 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:33:33 np0005539505 podman[110623]: 2025-11-29 06:33:33.764039051 +0000 UTC m=+0.097703556 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 01:33:53 np0005539505 kernel: SELinux:  Converting 2757 SID table entries...
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:33:53 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:33:57 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 29 01:33:57 np0005539505 podman[110786]: 2025-11-29 06:33:57.744026345 +0000 UTC m=+0.058311545 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:34:03 np0005539505 kernel: SELinux:  Converting 2757 SID table entries...
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:34:03 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:34:04 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 29 01:34:04 np0005539505 podman[110813]: 2025-11-29 06:34:04.768072652 +0000 UTC m=+0.092376619 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 01:34:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:34:26.900 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:34:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:34:26.902 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:34:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:34:26.902 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:34:28 np0005539505 podman[119137]: 2025-11-29 06:34:28.711078661 +0000 UTC m=+0.047414639 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 01:34:35 np0005539505 podman[123930]: 2025-11-29 06:34:35.736205328 +0000 UTC m=+0.072040259 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:34:59 np0005539505 podman[127683]: 2025-11-29 06:34:59.724351612 +0000 UTC m=+0.056724579 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 29 01:35:06 np0005539505 podman[127707]: 2025-11-29 06:35:06.771165845 +0000 UTC m=+0.104033106 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller)
Nov 29 01:35:11 np0005539505 kernel: SELinux:  Converting 2758 SID table entries...
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability network_peer_controls=1
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability open_perms=1
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability extended_socket_class=1
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability always_check_network=0
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 29 01:35:11 np0005539505 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 29 01:35:14 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:35:14 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 29 01:35:14 np0005539505 dbus-broker-launch[765]: Noticed file-system modification, trigger reload.
Nov 29 01:35:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:35:26.903 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:35:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:35:26.906 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:35:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:35:26.906 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:35:30 np0005539505 podman[127841]: 2025-11-29 06:35:30.815107469 +0000 UTC m=+0.128232414 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:35:37 np0005539505 podman[127998]: 2025-11-29 06:35:37.795135005 +0000 UTC m=+0.130743191 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:35:39 np0005539505 systemd[1]: Stopping OpenSSH server daemon...
Nov 29 01:35:39 np0005539505 systemd[1]: sshd.service: Deactivated successfully.
Nov 29 01:35:39 np0005539505 systemd[1]: Stopped OpenSSH server daemon.
Nov 29 01:35:39 np0005539505 systemd[1]: sshd.service: Consumed 1.156s CPU time, read 32.0K from disk, written 0B to disk.
Nov 29 01:35:39 np0005539505 systemd[1]: Stopped target sshd-keygen.target.
Nov 29 01:35:39 np0005539505 systemd[1]: Stopping sshd-keygen.target...
Nov 29 01:35:39 np0005539505 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:39 np0005539505 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:39 np0005539505 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 29 01:35:39 np0005539505 systemd[1]: Reached target sshd-keygen.target.
Nov 29 01:35:39 np0005539505 systemd[1]: Starting OpenSSH server daemon...
Nov 29 01:35:39 np0005539505 systemd[1]: Started OpenSSH server daemon.
Nov 29 01:35:42 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:35:42 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:35:42 np0005539505 systemd[1]: Reloading.
Nov 29 01:35:42 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:35:42 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:35:43 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:36:01 np0005539505 podman[137039]: 2025-11-29 06:36:01.78066549 +0000 UTC m=+0.096519391 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:36:08 np0005539505 podman[137175]: 2025-11-29 06:36:08.038098651 +0000 UTC m=+0.111013194 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:36:16 np0005539505 python3.9[137352]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:16 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:16 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:16 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:17 np0005539505 python3.9[137542]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:17 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:17 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:17 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:18 np0005539505 python3.9[137732]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:18 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:18 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:18 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:19 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:36:19 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:36:19 np0005539505 systemd[1]: man-db-cache-update.service: Consumed 9.950s CPU time.
Nov 29 01:36:19 np0005539505 systemd[1]: run-r93cd6ee2e7634601940ef84dca48b73b.service: Deactivated successfully.
Nov 29 01:36:19 np0005539505 python3.9[137924]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:19 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:19 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:19 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:21 np0005539505 python3.9[138114]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:22 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:22 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:22 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:24 np0005539505 python3.9[138303]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:24 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:24 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:24 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:26 np0005539505 python3.9[138492]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:26 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:26 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:26 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:36:26.905 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:36:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:36:26.907 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:36:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:36:26.907 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:36:27 np0005539505 python3.9[138681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:28 np0005539505 python3.9[138836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:28 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:28 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:28 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:30 np0005539505 python3.9[139026]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 29 01:36:30 np0005539505 systemd[1]: Reloading.
Nov 29 01:36:30 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:36:30 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:36:30 np0005539505 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 29 01:36:30 np0005539505 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 29 01:36:31 np0005539505 python3.9[139221]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:31 np0005539505 podman[139223]: 2025-11-29 06:36:31.968570364 +0000 UTC m=+0.068223040 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:36:32 np0005539505 python3.9[139395]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:33 np0005539505 python3.9[139550]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:34 np0005539505 python3.9[139705]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:35 np0005539505 python3.9[139860]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:36 np0005539505 python3.9[140015]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:37 np0005539505 python3.9[140170]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:38 np0005539505 podman[140297]: 2025-11-29 06:36:38.253539605 +0000 UTC m=+0.106288149 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Nov 29 01:36:38 np0005539505 python3.9[140344]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:39 np0005539505 python3.9[140507]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:39 np0005539505 python3.9[140662]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:40 np0005539505 python3.9[140817]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:41 np0005539505 python3.9[140972]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:42 np0005539505 python3.9[141127]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:43 np0005539505 python3.9[141282]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 29 01:36:44 np0005539505 python3.9[141437]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:45 np0005539505 python3.9[141589]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:45 np0005539505 python3.9[141741]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:46 np0005539505 python3.9[141893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:47 np0005539505 python3.9[142045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:47 np0005539505 python3.9[142197]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:36:48 np0005539505 python3.9[142349]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:49 np0005539505 python3.9[142474]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398208.4221942-1629-30433727384357/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:50 np0005539505 python3.9[142626]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:51 np0005539505 python3.9[142751]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398209.886086-1629-79472059868383/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:51 np0005539505 python3.9[142903]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:52 np0005539505 python3.9[143028]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398211.2603536-1629-187114374147877/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:52 np0005539505 python3.9[143180]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:53 np0005539505 python3.9[143305]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398212.490243-1629-37402310973971/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:54 np0005539505 python3.9[143457]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:54 np0005539505 python3.9[143582]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398213.6918397-1629-220819765853172/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:55 np0005539505 python3.9[143735]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:56 np0005539505 python3.9[143860]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398215.2031393-1629-198805799584781/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:56 np0005539505 python3.9[144012]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:57 np0005539505 python3.9[144135]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398216.3773196-1629-146434273591298/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:57 np0005539505 python3.9[144287]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:36:58 np0005539505 python3.9[144412]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764398217.4814172-1629-279011140279339/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:36:59 np0005539505 python3.9[144565]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 29 01:37:00 np0005539505 python3.9[144718]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:01 np0005539505 python3.9[144870]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:02 np0005539505 python3.9[145022]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:02 np0005539505 podman[145146]: 2025-11-29 06:37:02.583658076 +0000 UTC m=+0.045835690 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Nov 29 01:37:02 np0005539505 python3.9[145190]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:03 np0005539505 python3.9[145344]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:04 np0005539505 python3.9[145496]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:04 np0005539505 python3.9[145648]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:05 np0005539505 python3.9[145800]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:05 np0005539505 python3.9[145952]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:06 np0005539505 python3.9[146104]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:07 np0005539505 python3.9[146256]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:07 np0005539505 python3.9[146408]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:08 np0005539505 python3.9[146560]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:08 np0005539505 podman[146660]: 2025-11-29 06:37:08.774706153 +0000 UTC m=+0.093305721 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 29 01:37:08 np0005539505 python3.9[146735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:10 np0005539505 python3.9[146888]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:11 np0005539505 python3.9[147011]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398230.0083401-2292-158807421934114/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:11 np0005539505 python3.9[147163]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:12 np0005539505 python3.9[147286]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398231.2588239-2292-31259882014785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:12 np0005539505 python3.9[147438]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:13 np0005539505 python3.9[147561]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398232.327851-2292-13690170942461/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:13 np0005539505 python3.9[147713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:14 np0005539505 python3.9[147836]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398233.448991-2292-14217252414023/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:15 np0005539505 python3.9[147988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:15 np0005539505 python3.9[148111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398234.869262-2292-204606801363560/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:16 np0005539505 python3.9[148263]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:16 np0005539505 python3.9[148386]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398235.9747686-2292-80713911306770/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:17 np0005539505 python3.9[148538]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:17 np0005539505 python3.9[148661]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398237.062053-2292-232723332390164/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:18 np0005539505 python3.9[148813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:19 np0005539505 python3.9[148936]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398238.1118512-2292-38987494686941/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:19 np0005539505 python3.9[149088]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:20 np0005539505 python3.9[149211]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398239.2568035-2292-171501893816679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:20 np0005539505 python3.9[149363]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:21 np0005539505 python3.9[149486]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398240.3165576-2292-99224557868678/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:21 np0005539505 python3.9[149638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:22 np0005539505 python3.9[149761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398241.4223957-2292-262195626423217/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:22 np0005539505 python3.9[149913]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:23 np0005539505 python3.9[150036]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398242.5213463-2292-213921554517054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:24 np0005539505 python3.9[150188]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:24 np0005539505 python3.9[150311]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398243.6630828-2292-164145483773556/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:25 np0005539505 python3.9[150463]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:25 np0005539505 python3.9[150586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398244.7818496-2292-266290957279413/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:37:26.907 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:37:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:37:26.909 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:37:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:37:26.909 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:37:27 np0005539505 python3.9[150736]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:37:29 np0005539505 python3.9[150891]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 29 01:37:32 np0005539505 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 29 01:37:32 np0005539505 podman[150896]: 2025-11-29 06:37:32.810141843 +0000 UTC m=+0.112595360 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:37:36 np0005539505 python3.9[151068]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:36 np0005539505 python3.9[151220]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:37 np0005539505 python3.9[151372]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539505 python3.9[151524]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:38 np0005539505 podman[151648]: 2025-11-29 06:37:38.966197538 +0000 UTC m=+0.101414622 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:37:39 np0005539505 python3.9[151691]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:41 np0005539505 python3.9[151852]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:41 np0005539505 python3.9[152004]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:42 np0005539505 python3.9[152156]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:43 np0005539505 python3.9[152308]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:43 np0005539505 python3.9[152460]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:45 np0005539505 python3.9[152612]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:45 np0005539505 systemd[1]: Reloading.
Nov 29 01:37:45 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:45 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:45 np0005539505 systemd[1]: Starting libvirt logging daemon socket...
Nov 29 01:37:45 np0005539505 systemd[1]: Listening on libvirt logging daemon socket.
Nov 29 01:37:45 np0005539505 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 29 01:37:45 np0005539505 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 29 01:37:45 np0005539505 systemd[1]: Starting libvirt logging daemon...
Nov 29 01:37:45 np0005539505 systemd[1]: Started libvirt logging daemon.
Nov 29 01:37:46 np0005539505 python3.9[152805]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:46 np0005539505 systemd[1]: Reloading.
Nov 29 01:37:46 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:46 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:46 np0005539505 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 29 01:37:46 np0005539505 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 29 01:37:46 np0005539505 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 29 01:37:46 np0005539505 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 29 01:37:46 np0005539505 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 29 01:37:46 np0005539505 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 29 01:37:46 np0005539505 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:37:46 np0005539505 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:37:47 np0005539505 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 29 01:37:47 np0005539505 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 29 01:37:47 np0005539505 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 29 01:37:47 np0005539505 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 29 01:37:47 np0005539505 python3.9[153024]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:47 np0005539505 systemd[1]: Reloading.
Nov 29 01:37:47 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:47 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:47 np0005539505 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 29 01:37:47 np0005539505 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 29 01:37:47 np0005539505 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 29 01:37:47 np0005539505 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 29 01:37:47 np0005539505 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:37:48 np0005539505 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:37:48 np0005539505 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 18955e96-9779-4fc0-a274-73eb5b498a5b
Nov 29 01:37:48 np0005539505 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:37:48 np0005539505 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 18955e96-9779-4fc0-a274-73eb5b498a5b
Nov 29 01:37:48 np0005539505 setroubleshoot[152919]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 29 01:37:49 np0005539505 python3.9[153245]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:49 np0005539505 systemd[1]: Reloading.
Nov 29 01:37:49 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:49 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:49 np0005539505 systemd[1]: Listening on libvirt locking daemon socket.
Nov 29 01:37:49 np0005539505 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 29 01:37:49 np0005539505 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 29 01:37:49 np0005539505 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 29 01:37:49 np0005539505 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 29 01:37:49 np0005539505 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 29 01:37:49 np0005539505 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 29 01:37:49 np0005539505 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 29 01:37:49 np0005539505 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 29 01:37:49 np0005539505 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 29 01:37:49 np0005539505 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:37:49 np0005539505 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:37:50 np0005539505 python3.9[153460]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:37:50 np0005539505 systemd[1]: Reloading.
Nov 29 01:37:50 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:37:50 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:37:50 np0005539505 systemd[1]: Starting libvirt secret daemon socket...
Nov 29 01:37:50 np0005539505 systemd[1]: Listening on libvirt secret daemon socket.
Nov 29 01:37:50 np0005539505 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 29 01:37:50 np0005539505 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 29 01:37:50 np0005539505 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 29 01:37:50 np0005539505 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 29 01:37:50 np0005539505 systemd[1]: Starting libvirt secret daemon...
Nov 29 01:37:50 np0005539505 systemd[1]: Started libvirt secret daemon.
Nov 29 01:37:53 np0005539505 python3.9[153671]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:54 np0005539505 python3.9[153823]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:37:56 np0005539505 python3.9[153975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:56 np0005539505 python3.9[154098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398275.2080667-3328-202873225170138/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:58 np0005539505 python3.9[154250]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:37:58 np0005539505 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 29 01:37:58 np0005539505 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.016s CPU time.
Nov 29 01:37:58 np0005539505 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 29 01:37:58 np0005539505 python3.9[154402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:37:59 np0005539505 python3.9[154480]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:00 np0005539505 python3.9[154632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:00 np0005539505 python3.9[154710]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.gh38vnus recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:01 np0005539505 python3.9[154862]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:01 np0005539505 python3.9[154940]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:02 np0005539505 python3.9[155092]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:03 np0005539505 podman[155217]: 2025-11-29 06:38:03.515075138 +0000 UTC m=+0.064914942 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:38:03 np0005539505 python3[155264]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:38:04 np0005539505 python3.9[155416]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:05 np0005539505 python3.9[155494]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:06 np0005539505 python3.9[155646]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:06 np0005539505 python3.9[155724]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:07 np0005539505 python3.9[155876]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:07 np0005539505 python3.9[155954]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:08 np0005539505 python3.9[156106]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:08 np0005539505 python3.9[156184]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:09 np0005539505 podman[156301]: 2025-11-29 06:38:09.749104514 +0000 UTC m=+0.087632278 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 01:38:09 np0005539505 python3.9[156362]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:10 np0005539505 python3.9[156488]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398289.3463407-3703-102682076825068/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:11 np0005539505 python3.9[156640]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:12 np0005539505 python3.9[156792]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:13 np0005539505 python3.9[156947]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:14 np0005539505 python3.9[157099]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:14 np0005539505 python3.9[157252]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:15 np0005539505 python3.9[157406]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:16 np0005539505 python3.9[157561]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:17 np0005539505 python3.9[157713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:17 np0005539505 python3.9[157836]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398296.611474-3919-281432664799561/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:18 np0005539505 python3.9[157988]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:19 np0005539505 python3.9[158111]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398298.1243007-3964-33398683662532/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:19 np0005539505 python3.9[158263]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:20 np0005539505 python3.9[158386]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398299.4293065-4009-70452356838846/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:21 np0005539505 python3.9[158538]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:21 np0005539505 systemd[1]: Reloading.
Nov 29 01:38:21 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:21 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:21 np0005539505 systemd[1]: Reached target edpm_libvirt.target.
Nov 29 01:38:22 np0005539505 python3.9[158730]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 29 01:38:22 np0005539505 systemd[1]: Reloading.
Nov 29 01:38:22 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:22 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:23 np0005539505 systemd[1]: Reloading.
Nov 29 01:38:23 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:23 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:24 np0005539505 systemd[1]: session-23.scope: Deactivated successfully.
Nov 29 01:38:24 np0005539505 systemd[1]: session-23.scope: Consumed 3min 15.571s CPU time.
Nov 29 01:38:24 np0005539505 systemd-logind[794]: Session 23 logged out. Waiting for processes to exit.
Nov 29 01:38:24 np0005539505 systemd-logind[794]: Removed session 23.
Nov 29 01:38:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:38:26.909 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:38:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:38:26.912 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:38:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:38:26.912 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:38:30 np0005539505 systemd-logind[794]: New session 24 of user zuul.
Nov 29 01:38:30 np0005539505 systemd[1]: Started Session 24 of User zuul.
Nov 29 01:38:31 np0005539505 python3.9[158979]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:38:32 np0005539505 python3.9[159133]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:38:33 np0005539505 network[159150]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:38:33 np0005539505 network[159151]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:38:33 np0005539505 network[159152]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:38:33 np0005539505 podman[159158]: 2025-11-29 06:38:33.903136369 +0000 UTC m=+0.056329290 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:38:37 np0005539505 python3.9[159443]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 29 01:38:38 np0005539505 python3.9[159527]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:38:40 np0005539505 podman[159529]: 2025-11-29 06:38:40.839793077 +0000 UTC m=+0.145055877 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:38:45 np0005539505 python3.9[159706]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:46 np0005539505 python3.9[159858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:47 np0005539505 python3.9[160011]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:38:47 np0005539505 python3.9[160163]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:38:48 np0005539505 python3.9[160316]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:38:49 np0005539505 python3.9[160439]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398328.20904-253-214291747361055/.source.iscsi _original_basename=.bntwo1ao follow=False checksum=82232450b5fa6d5a937d682055cd69f6920ca682 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:50 np0005539505 python3.9[160591]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:51 np0005539505 python3.9[160743]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:38:51 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:38:51 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:38:51 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:38:52 np0005539505 python3.9[160896]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:52 np0005539505 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 29 01:38:53 np0005539505 python3.9[161052]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:38:53 np0005539505 systemd[1]: Reloading.
Nov 29 01:38:53 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:38:53 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:38:53 np0005539505 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:38:53 np0005539505 systemd[1]: Starting Open-iSCSI...
Nov 29 01:38:53 np0005539505 kernel: Loading iSCSI transport class v2.0-870.
Nov 29 01:38:53 np0005539505 systemd[1]: Started Open-iSCSI.
Nov 29 01:38:53 np0005539505 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 29 01:38:53 np0005539505 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 29 01:38:54 np0005539505 python3.9[161253]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:38:55 np0005539505 network[161270]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:38:55 np0005539505 network[161271]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:38:55 np0005539505 network[161272]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:38:59 np0005539505 python3.9[161543]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:00 np0005539505 python3.9[161695]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 29 01:39:01 np0005539505 python3.9[161851]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:01 np0005539505 python3.9[161974]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398340.7926037-483-121387439529197/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:02 np0005539505 python3.9[162126]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:03 np0005539505 python3.9[162278]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:03 np0005539505 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:03 np0005539505 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:03 np0005539505 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:03 np0005539505 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:03 np0005539505 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:04 np0005539505 podman[162406]: 2025-11-29 06:39:04.392133648 +0000 UTC m=+0.056209732 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Nov 29 01:39:04 np0005539505 python3.9[162451]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:05 np0005539505 python3.9[162603]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:06 np0005539505 python3.9[162755]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:06 np0005539505 python3.9[162907]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:07 np0005539505 python3.9[163030]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398346.332404-658-256214513297781/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:08 np0005539505 python3.9[163182]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:08 np0005539505 python3.9[163335]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:09 np0005539505 python3.9[163487]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:10 np0005539505 python3.9[163639]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:11 np0005539505 podman[163763]: 2025-11-29 06:39:11.18082285 +0000 UTC m=+0.080824563 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 01:39:11 np0005539505 python3.9[163808]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:11 np0005539505 python3.9[163967]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:12 np0005539505 python3.9[164119]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:13 np0005539505 python3.9[164271]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:14 np0005539505 python3.9[164423]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:14 np0005539505 python3.9[164577]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:15 np0005539505 python3.9[164729]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:16 np0005539505 python3.9[164881]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:17 np0005539505 python3.9[164959]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:17 np0005539505 python3.9[165111]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:18 np0005539505 python3.9[165189]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:19 np0005539505 python3.9[165341]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:19 np0005539505 python3.9[165493]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:20 np0005539505 python3.9[165571]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:20 np0005539505 python3.9[165723]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:21 np0005539505 python3.9[165801]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:22 np0005539505 python3.9[165953]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:22 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:22 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:22 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:23 np0005539505 python3.9[166142]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:23 np0005539505 python3.9[166220]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:24 np0005539505 python3.9[166372]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:25 np0005539505 python3.9[166450]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:25 np0005539505 python3.9[166602]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:26 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:26 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:26 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:26 np0005539505 systemd[1]: Starting Create netns directory...
Nov 29 01:39:26 np0005539505 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 29 01:39:26 np0005539505 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 29 01:39:26 np0005539505 systemd[1]: Finished Create netns directory.
Nov 29 01:39:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:39:26.911 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:39:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:39:26.914 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:39:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:39:26.914 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:39:27 np0005539505 python3.9[166796]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:28 np0005539505 python3.9[166948]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:28 np0005539505 python3.9[167071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398367.7209015-1279-119327895963003/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:29 np0005539505 python3.9[167223]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:39:30 np0005539505 python3.9[167375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:31 np0005539505 python3.9[167498]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398370.1746867-1353-103488747556037/.source.json _original_basename=.fg9zhfcm follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:32 np0005539505 python3.9[167650]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:34 np0005539505 podman[168050]: 2025-11-29 06:39:34.523760679 +0000 UTC m=+0.095472469 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:39:34 np0005539505 python3.9[168091]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 29 01:39:35 np0005539505 python3.9[168246]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:39:36 np0005539505 python3.9[168398]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 29 01:39:38 np0005539505 python3[168577]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:39:38 np0005539505 podman[168616]: 2025-11-29 06:39:38.702861927 +0000 UTC m=+0.022693637 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:39:38 np0005539505 podman[168616]: 2025-11-29 06:39:38.816478272 +0000 UTC m=+0.136309952 container create 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:39:38 np0005539505 python3[168577]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 29 01:39:39 np0005539505 python3.9[168806]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:40 np0005539505 python3.9[168960]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:41 np0005539505 python3.9[169036]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:41 np0005539505 podman[169135]: 2025-11-29 06:39:41.793184571 +0000 UTC m=+0.117064685 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Nov 29 01:39:41 np0005539505 python3.9[169215]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398381.3585646-1617-212098035703447/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:42 np0005539505 python3.9[169291]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:39:42 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:42 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:42 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:43 np0005539505 python3.9[169403]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:39:43 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:43 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:43 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:43 np0005539505 systemd[1]: Starting multipathd container...
Nov 29 01:39:43 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:39:43 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc981172e0a4d968eeb1f48e50cb44848871e2172277d32c06cf821682465b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:43 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc981172e0a4d968eeb1f48e50cb44848871e2172277d32c06cf821682465b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:43 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.
Nov 29 01:39:43 np0005539505 podman[169444]: 2025-11-29 06:39:43.781014728 +0000 UTC m=+0.121162691 container init 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 01:39:43 np0005539505 multipathd[169459]: + sudo -E kolla_set_configs
Nov 29 01:39:43 np0005539505 podman[169444]: 2025-11-29 06:39:43.804525198 +0000 UTC m=+0.144673151 container start 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:39:43 np0005539505 podman[169444]: multipathd
Nov 29 01:39:43 np0005539505 systemd[1]: Started multipathd container.
Nov 29 01:39:43 np0005539505 multipathd[169459]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:39:43 np0005539505 multipathd[169459]: INFO:__main__:Validating config file
Nov 29 01:39:43 np0005539505 multipathd[169459]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:39:43 np0005539505 multipathd[169459]: INFO:__main__:Writing out command to execute
Nov 29 01:39:43 np0005539505 multipathd[169459]: ++ cat /run_command
Nov 29 01:39:43 np0005539505 multipathd[169459]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:39:43 np0005539505 multipathd[169459]: + ARGS=
Nov 29 01:39:43 np0005539505 multipathd[169459]: + sudo kolla_copy_cacerts
Nov 29 01:39:43 np0005539505 multipathd[169459]: + [[ ! -n '' ]]
Nov 29 01:39:43 np0005539505 multipathd[169459]: + . kolla_extend_start
Nov 29 01:39:43 np0005539505 multipathd[169459]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:39:43 np0005539505 multipathd[169459]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:39:43 np0005539505 multipathd[169459]: + umask 0022
Nov 29 01:39:43 np0005539505 multipathd[169459]: + exec /usr/sbin/multipathd -d
Nov 29 01:39:43 np0005539505 podman[169466]: 2025-11-29 06:39:43.895130848 +0000 UTC m=+0.078789085 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:39:43 np0005539505 multipathd[169459]: 3831.538831 | --------start up--------
Nov 29 01:39:43 np0005539505 multipathd[169459]: 3831.538852 | read /etc/multipath.conf
Nov 29 01:39:43 np0005539505 systemd[1]: 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-74a43d474be17ad.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:39:43 np0005539505 systemd[1]: 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-74a43d474be17ad.service: Failed with result 'exit-code'.
Nov 29 01:39:43 np0005539505 multipathd[169459]: 3831.545488 | path checkers start up
Nov 29 01:39:45 np0005539505 python3.9[169648]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:39:46 np0005539505 python3.9[169802]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:39:46 np0005539505 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 29 01:39:47 np0005539505 python3.9[169967]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:47 np0005539505 systemd[1]: Stopping multipathd container...
Nov 29 01:39:47 np0005539505 multipathd[169459]: 3834.744865 | exit (signal)
Nov 29 01:39:47 np0005539505 multipathd[169459]: 3834.745471 | --------shut down-------
Nov 29 01:39:47 np0005539505 systemd[1]: libpod-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.scope: Deactivated successfully.
Nov 29 01:39:47 np0005539505 podman[169972]: 2025-11-29 06:39:47.125697805 +0000 UTC m=+0.054637877 container died 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:39:47 np0005539505 systemd[1]: 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-74a43d474be17ad.timer: Deactivated successfully.
Nov 29 01:39:47 np0005539505 systemd[1]: Stopped /usr/bin/podman healthcheck run 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.
Nov 29 01:39:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-userdata-shm.mount: Deactivated successfully.
Nov 29 01:39:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay-2fdc981172e0a4d968eeb1f48e50cb44848871e2172277d32c06cf821682465b-merged.mount: Deactivated successfully.
Nov 29 01:39:47 np0005539505 podman[169972]: 2025-11-29 06:39:47.178076996 +0000 UTC m=+0.107017068 container cleanup 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:39:47 np0005539505 podman[169972]: multipathd
Nov 29 01:39:47 np0005539505 podman[169999]: multipathd
Nov 29 01:39:47 np0005539505 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 29 01:39:47 np0005539505 systemd[1]: Stopped multipathd container.
Nov 29 01:39:47 np0005539505 systemd[1]: Starting multipathd container...
Nov 29 01:39:47 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:39:47 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc981172e0a4d968eeb1f48e50cb44848871e2172277d32c06cf821682465b/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:47 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc981172e0a4d968eeb1f48e50cb44848871e2172277d32c06cf821682465b/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:39:47 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.
Nov 29 01:39:47 np0005539505 podman[170012]: 2025-11-29 06:39:47.34218377 +0000 UTC m=+0.088859642 container init 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:39:47 np0005539505 multipathd[170027]: + sudo -E kolla_set_configs
Nov 29 01:39:47 np0005539505 podman[170012]: 2025-11-29 06:39:47.375574701 +0000 UTC m=+0.122250563 container start 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:39:47 np0005539505 podman[170012]: multipathd
Nov 29 01:39:47 np0005539505 systemd[1]: Started multipathd container.
Nov 29 01:39:47 np0005539505 multipathd[170027]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:39:47 np0005539505 multipathd[170027]: INFO:__main__:Validating config file
Nov 29 01:39:47 np0005539505 multipathd[170027]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:39:47 np0005539505 multipathd[170027]: INFO:__main__:Writing out command to execute
Nov 29 01:39:47 np0005539505 multipathd[170027]: ++ cat /run_command
Nov 29 01:39:47 np0005539505 multipathd[170027]: + CMD='/usr/sbin/multipathd -d'
Nov 29 01:39:47 np0005539505 multipathd[170027]: + ARGS=
Nov 29 01:39:47 np0005539505 multipathd[170027]: + sudo kolla_copy_cacerts
Nov 29 01:39:47 np0005539505 multipathd[170027]: + [[ ! -n '' ]]
Nov 29 01:39:47 np0005539505 multipathd[170027]: + . kolla_extend_start
Nov 29 01:39:47 np0005539505 multipathd[170027]: Running command: '/usr/sbin/multipathd -d'
Nov 29 01:39:47 np0005539505 multipathd[170027]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 29 01:39:47 np0005539505 multipathd[170027]: + umask 0022
Nov 29 01:39:47 np0005539505 multipathd[170027]: + exec /usr/sbin/multipathd -d
Nov 29 01:39:47 np0005539505 podman[170034]: 2025-11-29 06:39:47.442418854 +0000 UTC m=+0.057826577 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:39:47 np0005539505 systemd[1]: 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-622312fb4efd99ca.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:39:47 np0005539505 systemd[1]: 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3-622312fb4efd99ca.service: Failed with result 'exit-code'.
Nov 29 01:39:47 np0005539505 multipathd[170027]: 3835.097359 | --------start up--------
Nov 29 01:39:47 np0005539505 multipathd[170027]: 3835.097487 | read /etc/multipath.conf
Nov 29 01:39:47 np0005539505 multipathd[170027]: 3835.103234 | path checkers start up
Nov 29 01:39:48 np0005539505 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:39:48 np0005539505 python3.9[170219]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:49 np0005539505 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 29 01:39:49 np0005539505 python3.9[170371]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 29 01:39:50 np0005539505 python3.9[170524]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 29 01:39:50 np0005539505 kernel: Key type psk registered
Nov 29 01:39:50 np0005539505 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 29 01:39:51 np0005539505 python3.9[170686]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:39:51 np0005539505 python3.9[170809]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398390.6044552-1858-197929454624482/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:52 np0005539505 python3.9[170961]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:39:53 np0005539505 python3.9[171113]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:39:53 np0005539505 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 29 01:39:53 np0005539505 systemd[1]: Stopped Load Kernel Modules.
Nov 29 01:39:53 np0005539505 systemd[1]: Stopping Load Kernel Modules...
Nov 29 01:39:53 np0005539505 systemd[1]: Starting Load Kernel Modules...
Nov 29 01:39:53 np0005539505 systemd[1]: Finished Load Kernel Modules.
Nov 29 01:39:54 np0005539505 python3.9[171269]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 29 01:39:58 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:58 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:58 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:58 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:58 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:58 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:58 np0005539505 systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 29 01:39:58 np0005539505 systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 29 01:39:59 np0005539505 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 29 01:39:59 np0005539505 systemd[1]: Starting man-db-cache-update.service...
Nov 29 01:39:59 np0005539505 systemd[1]: Reloading.
Nov 29 01:39:59 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:39:59 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:39:59 np0005539505 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 29 01:40:01 np0005539505 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 29 01:40:01 np0005539505 systemd[1]: Finished man-db-cache-update.service.
Nov 29 01:40:01 np0005539505 systemd[1]: man-db-cache-update.service: Consumed 1.500s CPU time.
Nov 29 01:40:01 np0005539505 systemd[1]: run-r373ac42adfa54e0c844dee37fcc37e62.service: Deactivated successfully.
Nov 29 01:40:01 np0005539505 python3.9[172719]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:40:01 np0005539505 systemd[1]: Stopping Open-iSCSI...
Nov 29 01:40:01 np0005539505 iscsid[161093]: iscsid shutting down.
Nov 29 01:40:01 np0005539505 systemd[1]: iscsid.service: Deactivated successfully.
Nov 29 01:40:01 np0005539505 systemd[1]: Stopped Open-iSCSI.
Nov 29 01:40:01 np0005539505 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 29 01:40:01 np0005539505 systemd[1]: Starting Open-iSCSI...
Nov 29 01:40:01 np0005539505 systemd[1]: Started Open-iSCSI.
Nov 29 01:40:02 np0005539505 python3.9[172875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:40:03 np0005539505 python3.9[173031]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:04 np0005539505 python3.9[173183]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:04 np0005539505 systemd[1]: Reloading.
Nov 29 01:40:04 np0005539505 podman[173184]: 2025-11-29 06:40:04.747345593 +0000 UTC m=+0.079273086 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:40:04 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:04 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:05 np0005539505 python3.9[173386]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:40:05 np0005539505 network[173403]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:40:05 np0005539505 network[173404]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:40:05 np0005539505 network[173405]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:40:11 np0005539505 python3.9[173679]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:12 np0005539505 podman[173804]: 2025-11-29 06:40:12.058092925 +0000 UTC m=+0.184315581 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:40:12 np0005539505 python3.9[173851]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:12 np0005539505 python3.9[174010]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:13 np0005539505 python3.9[174163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:14 np0005539505 python3.9[174316]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:15 np0005539505 python3.9[174469]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:16 np0005539505 python3.9[174622]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:16 np0005539505 python3.9[174775]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:40:17 np0005539505 podman[174801]: 2025-11-29 06:40:17.787148535 +0000 UTC m=+0.110438038 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Nov 29 01:40:18 np0005539505 python3.9[174950]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:19 np0005539505 python3.9[175102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:20 np0005539505 python3.9[175254]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:20 np0005539505 python3.9[175406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:21 np0005539505 python3.9[175558]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:21 np0005539505 python3.9[175710]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:22 np0005539505 python3.9[175862]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:23 np0005539505 python3.9[176014]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:24 np0005539505 python3.9[176166]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539505 python3.9[176318]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:25 np0005539505 python3.9[176470]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:26 np0005539505 python3.9[176622]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:40:26.913 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:40:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:40:26.915 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:40:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:40:26.916 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:40:27 np0005539505 python3.9[176774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:27 np0005539505 python3.9[176926]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:28 np0005539505 python3.9[177078]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:29 np0005539505 python3.9[177230]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:40:30 np0005539505 python3.9[177382]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:30 np0005539505 python3.9[177534]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:40:31 np0005539505 python3.9[177686]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:40:31 np0005539505 systemd[1]: Reloading.
Nov 29 01:40:31 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:40:31 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:40:32 np0005539505 python3.9[177873]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:33 np0005539505 python3.9[178026]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:34 np0005539505 python3.9[178179]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:34 np0005539505 python3.9[178332]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:35 np0005539505 podman[178457]: 2025-11-29 06:40:35.405183612 +0000 UTC m=+0.069980775 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:40:35 np0005539505 python3.9[178503]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:36 np0005539505 python3.9[178656]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:36 np0005539505 python3.9[178809]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:37 np0005539505 python3.9[178962]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:40:39 np0005539505 python3.9[179115]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:39 np0005539505 python3.9[179267]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:40 np0005539505 python3.9[179419]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:41 np0005539505 python3.9[179571]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:42 np0005539505 python3.9[179723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:42 np0005539505 podman[179724]: 2025-11-29 06:40:42.358325568 +0000 UTC m=+0.094280850 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 01:40:42 np0005539505 python3.9[179899]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:43 np0005539505 python3.9[180051]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:44 np0005539505 python3.9[180203]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:45 np0005539505 python3.9[180355]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:45 np0005539505 python3.9[180507]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:48 np0005539505 podman[180532]: 2025-11-29 06:40:48.739240244 +0000 UTC m=+0.075682888 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 01:40:50 np0005539505 python3.9[180679]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 29 01:40:51 np0005539505 python3.9[180832]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:40:52 np0005539505 python3.9[180990]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:40:54 np0005539505 systemd-logind[794]: New session 25 of user zuul.
Nov 29 01:40:54 np0005539505 systemd[1]: Started Session 25 of User zuul.
Nov 29 01:40:54 np0005539505 systemd[1]: session-25.scope: Deactivated successfully.
Nov 29 01:40:54 np0005539505 systemd-logind[794]: Session 25 logged out. Waiting for processes to exit.
Nov 29 01:40:54 np0005539505 systemd-logind[794]: Removed session 25.
Nov 29 01:40:55 np0005539505 python3.9[181176]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:55 np0005539505 python3.9[181297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398454.7340937-3421-129839437465743/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:56 np0005539505 python3.9[181447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:56 np0005539505 python3.9[181523]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:57 np0005539505 python3.9[181673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:58 np0005539505 python3.9[181794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398457.0262125-3421-276097119842089/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:58 np0005539505 python3.9[181944]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:40:59 np0005539505 python3.9[182065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398458.160626-3421-50443881654545/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:40:59 np0005539505 python3.9[182215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:00 np0005539505 python3.9[182336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398459.2921932-3421-237703550780462/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:00 np0005539505 python3.9[182486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:01 np0005539505 python3.9[182607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398460.3206265-3421-91823906702787/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:02 np0005539505 python3.9[182759]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:03 np0005539505 python3.9[182911]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:04 np0005539505 python3.9[183063]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:04 np0005539505 python3.9[183215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:05 np0005539505 python3.9[183338]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764398464.3411462-3742-199897043164873/.source _original_basename=.un7ixd64 follow=False checksum=f0cf1d5447f2d601ff5d5bbff9fb6d0c4f95e65c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 29 01:41:05 np0005539505 podman[183365]: 2025-11-29 06:41:05.729539884 +0000 UTC m=+0.060661534 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 29 01:41:06 np0005539505 python3.9[183509]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:06 np0005539505 python3.9[183661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:07 np0005539505 python3.9[183782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398466.5482292-3820-47352602163371/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:08 np0005539505 python3.9[183932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:41:08 np0005539505 python3.9[184053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398467.7875273-3865-226320366977702/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:09 np0005539505 python3.9[184205]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 29 01:41:10 np0005539505 python3.9[184357]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:41:11 np0005539505 python3[184509]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:41:11 np0005539505 podman[184546]: 2025-11-29 06:41:11.800305674 +0000 UTC m=+0.046656782 container create b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:41:11 np0005539505 podman[184546]: 2025-11-29 06:41:11.774390799 +0000 UTC m=+0.020741927 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:41:11 np0005539505 python3[184509]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 29 01:41:12 np0005539505 podman[184709]: 2025-11-29 06:41:12.698182909 +0000 UTC m=+0.118362098 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 01:41:12 np0005539505 python3.9[184756]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:14 np0005539505 python3.9[184918]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 29 01:41:14 np0005539505 python3.9[185070]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:41:15 np0005539505 python3[185222]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:41:16 np0005539505 podman[185257]: 2025-11-29 06:41:16.154587147 +0000 UTC m=+0.025916525 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 29 01:41:16 np0005539505 podman[185257]: 2025-11-29 06:41:16.583180737 +0000 UTC m=+0.454510075 container create d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:41:16 np0005539505 python3[185222]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 29 01:41:17 np0005539505 python3.9[185447]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:18 np0005539505 python3.9[185601]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:18 np0005539505 podman[185724]: 2025-11-29 06:41:18.882973146 +0000 UTC m=+0.054404803 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible)
Nov 29 01:41:19 np0005539505 python3.9[185772]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398478.4953754-4140-200261557874073/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:19 np0005539505 python3.9[185848]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:19 np0005539505 systemd[1]: Reloading.
Nov 29 01:41:19 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:19 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:20 np0005539505 python3.9[185960]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:20 np0005539505 systemd[1]: Reloading.
Nov 29 01:41:20 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:20 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:21 np0005539505 systemd[1]: Starting nova_compute container...
Nov 29 01:41:21 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:41:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:21 np0005539505 podman[185999]: 2025-11-29 06:41:21.196876286 +0000 UTC m=+0.134968710 container init d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:41:21 np0005539505 podman[185999]: 2025-11-29 06:41:21.203860384 +0000 UTC m=+0.141952778 container start d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + sudo -E kolla_set_configs
Nov 29 01:41:21 np0005539505 podman[185999]: nova_compute
Nov 29 01:41:21 np0005539505 systemd[1]: Started nova_compute container.
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Validating config file
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying service configuration files
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Writing out command to execute
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:21 np0005539505 nova_compute[186014]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:21 np0005539505 nova_compute[186014]: ++ cat /run_command
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + CMD=nova-compute
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + ARGS=
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + sudo kolla_copy_cacerts
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + [[ ! -n '' ]]
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + . kolla_extend_start
Nov 29 01:41:21 np0005539505 nova_compute[186014]: Running command: 'nova-compute'
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + umask 0022
Nov 29 01:41:21 np0005539505 nova_compute[186014]: + exec nova-compute
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.375 186018 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.375 186018 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.376 186018 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.376 186018 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:41:23 np0005539505 python3.9[186176]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.524 186018 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.537 186018 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:41:23 np0005539505 nova_compute[186014]: 2025-11-29 06:41:23.537 186018 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.127 186018 INFO nova.virt.driver [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.241 186018 INFO nova.compute.provider_config [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.258 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.258 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.259 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.260 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.261 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.261 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.261 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.261 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.261 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.262 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.263 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.264 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.265 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.266 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.267 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.268 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.269 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.270 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.271 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.272 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.273 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.274 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.275 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.275 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.275 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.275 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.275 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.276 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.276 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.276 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.276 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.276 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.277 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.277 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.277 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.277 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.277 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.278 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.278 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.278 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.278 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.278 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.279 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.279 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.279 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.279 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.279 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.280 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.280 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.280 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.280 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.280 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.281 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.281 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.281 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.281 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.282 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.282 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.282 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.282 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.282 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.283 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.283 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.283 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.283 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.284 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.284 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.284 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.284 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.285 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.285 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.285 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.285 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.286 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.286 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.286 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.286 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.287 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.287 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.287 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.287 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.287 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.288 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.288 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.288 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.288 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.289 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.289 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.289 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.289 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.290 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.290 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.290 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.290 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.290 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.291 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.291 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.291 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.291 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.292 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.292 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.292 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.292 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.292 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.293 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.293 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.293 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.293 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.294 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.294 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.294 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.294 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.294 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.295 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.295 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.295 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.295 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.296 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.296 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.296 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.296 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.296 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.297 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.297 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.297 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.297 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.298 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.298 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.298 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.298 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.298 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.299 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.299 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.299 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.299 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.300 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.300 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.300 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.300 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.300 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.301 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.301 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.301 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.301 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.301 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.302 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.302 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.302 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.302 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.302 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.303 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.303 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.303 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.303 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.304 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.304 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.304 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.304 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.304 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.305 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.305 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.305 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.305 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.305 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.306 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.306 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.306 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.306 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.306 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.307 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.307 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.307 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.307 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.308 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.308 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.308 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.308 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.308 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.309 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.309 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.309 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.309 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.310 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.310 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.310 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.310 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.311 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.311 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.311 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.311 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.312 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.312 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.312 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.312 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.312 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.313 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.313 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.313 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.313 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.314 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.314 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.314 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.314 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.314 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.315 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.315 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.315 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.315 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.316 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.316 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.316 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.316 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.316 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.317 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.317 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.317 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.317 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.317 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.318 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.318 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.318 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.318 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.318 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.319 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.319 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.319 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.319 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.320 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.320 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.320 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.320 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.320 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.321 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.321 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.321 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.321 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.322 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.322 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.322 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.322 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.323 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.323 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.323 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.323 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.324 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.324 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.324 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.324 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.324 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.325 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.325 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.325 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.325 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.326 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.326 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.326 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.326 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.326 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.327 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.327 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.327 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.327 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.328 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.328 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.328 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.328 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.328 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.329 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.329 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.329 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.329 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.330 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.330 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.330 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.331 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.331 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.331 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.331 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.331 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.332 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.332 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.332 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.332 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.333 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.333 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.333 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.333 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.333 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.334 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.334 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.334 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.334 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.335 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.335 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.335 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.335 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.335 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.336 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.336 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.336 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.336 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.336 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.337 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.337 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.337 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.337 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.338 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.338 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.338 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.338 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.339 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.339 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.339 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.339 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.339 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.340 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.340 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.340 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.340 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.341 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.341 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.341 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.341 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.341 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.342 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.342 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.342 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.342 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.342 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.343 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.343 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.343 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.343 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.344 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.344 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.344 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.344 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.344 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.345 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.345 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.345 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.345 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.345 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.346 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.346 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.346 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.346 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.347 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.347 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.347 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.347 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.347 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.348 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.348 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.348 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.348 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.349 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.349 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.349 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.349 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.349 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.350 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.350 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.350 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.350 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.350 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.351 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.351 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.351 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.352 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.352 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.352 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.352 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.352 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.353 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.353 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.353 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.353 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.354 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.354 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.354 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.354 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.355 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.355 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.355 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.355 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.356 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.356 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.356 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.356 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.356 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.357 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.357 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.357 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.357 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.358 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.358 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.358 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.358 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.358 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.359 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.359 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.359 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.359 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.360 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.360 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.360 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.360 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.361 186018 WARNING oslo_config.cfg [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:41:24 np0005539505 nova_compute[186014]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:41:24 np0005539505 nova_compute[186014]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:41:24 np0005539505 nova_compute[186014]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:41:24 np0005539505 nova_compute[186014]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.361 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.361 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.361 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.362 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.362 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.362 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.363 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.363 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.363 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.363 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.363 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.364 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.365 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.365 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.365 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.365 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.366 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.366 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.366 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.366 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.366 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.367 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.368 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.368 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.368 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.368 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.368 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.369 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.370 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.371 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.372 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.373 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.374 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.375 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.376 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.377 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.377 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.377 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.377 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.377 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.378 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.379 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.380 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.381 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.382 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.383 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.384 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.385 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.386 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.387 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.388 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.389 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.390 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.391 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.392 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.392 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.392 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.392 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.392 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.393 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.393 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.393 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.393 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.394 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.394 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.394 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.394 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.394 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.395 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.395 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.395 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.395 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.396 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.396 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.396 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.396 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.396 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.397 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.397 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.397 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.397 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.397 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.398 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.399 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.399 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.399 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.399 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.399 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.400 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.400 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.400 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.400 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.400 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.401 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.401 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.401 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.401 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.401 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.402 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.403 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.403 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.403 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.403 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.403 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.404 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.404 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.404 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.404 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.404 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.405 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.405 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.405 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.405 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.406 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.406 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.406 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.406 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.406 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.407 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.407 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.407 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.407 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.407 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.408 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.409 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.409 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.409 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.409 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.409 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.410 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.410 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.410 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.410 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.410 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.411 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.411 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.411 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.411 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.411 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.412 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.412 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.412 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.412 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.412 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.413 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.413 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.413 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.413 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.413 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.414 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.414 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.414 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.414 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.414 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.415 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.415 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.415 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.415 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.415 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.416 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.416 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.416 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.416 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.416 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.417 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.417 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.417 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.417 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.417 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.418 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.418 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.418 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.418 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.418 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.419 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.420 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.420 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.420 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.420 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.420 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.421 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.421 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.421 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.421 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.421 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.422 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.422 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.422 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.422 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.423 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.423 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.423 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.423 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.423 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.424 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.424 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.424 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.424 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.424 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.425 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.425 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.425 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.425 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.425 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.426 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.426 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.426 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.426 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.426 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.427 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.427 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.427 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.427 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.427 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.428 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.429 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.429 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.429 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.429 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.429 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.430 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.430 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.430 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.430 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.430 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.431 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.431 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.431 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.431 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.431 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.432 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.433 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.433 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.433 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.433 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.433 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.434 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.434 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.434 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.434 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.434 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.435 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.435 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.435 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.435 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.435 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.436 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.436 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.436 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.436 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.436 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.437 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.438 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.438 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.438 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.438 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.438 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.439 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.439 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.439 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.439 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.439 186018 DEBUG oslo_service.service [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.441 186018 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.456 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.456 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.457 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.457 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:41:24 np0005539505 python3.9[186331]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:24 np0005539505 systemd[1]: Starting libvirt QEMU daemon...
Nov 29 01:41:24 np0005539505 systemd[1]: Started libvirt QEMU daemon.
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.517 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5d0a9c5bb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.519 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5d0a9c5bb0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.520 186018 INFO nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.544 186018 WARNING nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Nov 29 01:41:24 np0005539505 nova_compute[186014]: 2025-11-29 06:41:24.544 186018 DEBUG nova.virt.libvirt.volume.mount [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 29 01:41:25 np0005539505 python3.9[186541]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.298 186018 INFO nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <host>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <uuid>e5303377-33ba-4369-bb3f-4ec9de742582</uuid>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <arch>x86_64</arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <microcode version='16777317'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='x2apic'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='tsc-deadline'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='osxsave'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='hypervisor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='tsc_adjust'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='spec-ctrl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='stibp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='arch-capabilities'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='cmp_legacy'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='topoext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='virt-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='lbrv'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='tsc-scale'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='vmcb-clean'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='pause-filter'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='pfthreshold'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='rdctl-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='mds-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature name='pschange-mc-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <pages unit='KiB' size='4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <pages unit='KiB' size='2048'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <power_management>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <suspend_mem/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <suspend_disk/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <suspend_hybrid/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </power_management>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <iommu support='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <migration_features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <live/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <uri_transports>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <uri_transport>tcp</uri_transport>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <uri_transport>rdma</uri_transport>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </uri_transports>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </migration_features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <topology>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <cells num='1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <cell id='0'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <memory unit='KiB'>7864312</memory>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <pages unit='KiB' size='4'>1966078</pages>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <distances>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <sibling id='0' value='10'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          </distances>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          <cpus num='8'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:          </cpus>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        </cell>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </cells>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </topology>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <cache>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </cache>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <secmodel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model>selinux</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <doi>0</doi>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </secmodel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <secmodel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model>dac</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <doi>0</doi>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </secmodel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </host>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <guest>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <os_type>hvm</os_type>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <arch name='i686'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <wordsize>32</wordsize>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <domain type='qemu'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <domain type='kvm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <pae/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <nonpae/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <apic default='on' toggle='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <cpuselection/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <deviceboot/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <externalSnapshot/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </guest>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <guest>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <os_type>hvm</os_type>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <arch name='x86_64'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <wordsize>64</wordsize>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <domain type='qemu'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <domain type='kvm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <apic default='on' toggle='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <cpuselection/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <deviceboot/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <externalSnapshot/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </guest>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </capabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.305 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.320 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:41:25 np0005539505 nova_compute[186014]: <domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <arch>i686</arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <vcpu max='240'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <os supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>rom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pflash</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>yes</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='secure'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </loader>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </os>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>memfd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </memoryBacking>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>disk</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>floppy</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>lun</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ide</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>fdc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>sata</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </disk>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vnc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </graphics>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <video supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vga</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>none</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>bochs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </video>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='mode'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>requisite</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>optional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pci</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hostdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>random</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </rng>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>path</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>handle</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </filesystem>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emulator</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>external</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>2.0</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </tpm>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </redirdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </channel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </crypto>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>passt</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </interface>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>isa</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </panic>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <console supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>null</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dev</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pipe</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stdio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>udp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tcp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </console>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='features'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vapic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>runtime</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>synic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stimer</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reset</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ipi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>avic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hyperv>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tdx</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </launchSecurity>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.327 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:41:25 np0005539505 nova_compute[186014]: <domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <arch>i686</arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <vcpu max='4096'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <os supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>rom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pflash</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>yes</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='secure'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </loader>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </os>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>memfd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </memoryBacking>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>disk</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>floppy</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>lun</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>fdc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>sata</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </disk>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vnc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </graphics>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <video supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vga</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>none</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>bochs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </video>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='mode'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>requisite</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>optional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pci</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hostdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>random</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </rng>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>path</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>handle</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </filesystem>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emulator</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>external</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>2.0</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </tpm>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </redirdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </channel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </crypto>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>passt</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </interface>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>isa</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </panic>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <console supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>null</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dev</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pipe</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stdio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>udp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tcp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </console>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='features'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vapic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>runtime</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>synic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stimer</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reset</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ipi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>avic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hyperv>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tdx</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </launchSecurity>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.350 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.353 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:41:25 np0005539505 nova_compute[186014]: <domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <arch>x86_64</arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <vcpu max='240'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <os supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='firmware'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>rom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pflash</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>yes</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='secure'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </loader>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </os>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>memfd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </memoryBacking>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>disk</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>floppy</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>lun</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ide</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>fdc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>sata</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </disk>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vnc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </graphics>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <video supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vga</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>none</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>bochs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </video>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='mode'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>requisite</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>optional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pci</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hostdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>random</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </rng>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>path</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>handle</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </filesystem>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emulator</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>external</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>2.0</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </tpm>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </redirdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </channel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </crypto>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>passt</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </interface>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>isa</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </panic>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <console supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>null</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dev</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pipe</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stdio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>udp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tcp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </console>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='features'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vapic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>runtime</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>synic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stimer</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reset</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ipi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>avic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hyperv>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tdx</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </launchSecurity>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.409 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:41:25 np0005539505 nova_compute[186014]: <domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <domain>kvm</domain>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <arch>x86_64</arch>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <vcpu max='4096'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <iothreads supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <os supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='firmware'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>efi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <loader supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>rom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pflash</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='readonly'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>yes</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='secure'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>yes</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>no</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </loader>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </os>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='maximumMigratable'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>on</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>off</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <vendor>AMD</vendor>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='succor'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <mode name='custom' supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Denverton-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='auto-ibrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amd-psfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='stibp-always-on'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='EPYC-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-128'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-256'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx10-512'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='prefetchiti'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Haswell-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512er'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512pf'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fma4'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tbm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xop'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='amx-tile'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-bf16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-fp16'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bitalg'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrc'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fzrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='la57'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='taa-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xfd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ifma'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cmpccxadd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fbsdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='fsrs'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ibrs-all'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mcdt-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pbrsb-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='psdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='serialize'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vaes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='hle'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='rtm'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512bw'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512cd'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512dq'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512f'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='avx512vl'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='invpcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pcid'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='pku'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='mpx'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='core-capability'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='split-lock-detect'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='cldemote'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='erms'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='gfni'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdir64b'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='movdiri'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='xsaves'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='athlon-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='core2duo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='coreduo-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='n270-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='ss'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <blockers model='phenom-v1'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnow'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <feature name='3dnowext'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </blockers>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </mode>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <memoryBacking supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <enum name='sourceType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>anonymous</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <value>memfd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </memoryBacking>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <disk supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='diskDevice'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>disk</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cdrom</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>floppy</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>lun</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>fdc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>sata</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </disk>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <graphics supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vnc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egl-headless</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </graphics>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <video supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='modelType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vga</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>cirrus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>none</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>bochs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ramfb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </video>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hostdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='mode'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>subsystem</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='startupPolicy'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>mandatory</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>requisite</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>optional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='subsysType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pci</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>scsi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='capsType'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='pciBackend'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hostdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <rng supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtio-non-transitional</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>random</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>egd</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </rng>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <filesystem supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='driverType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>path</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>handle</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>virtiofs</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </filesystem>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <tpm supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-tis</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tpm-crb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emulator</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>external</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendVersion'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>2.0</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </tpm>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <redirdev supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='bus'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>usb</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </redirdev>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <channel supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </channel>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <crypto supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendModel'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>builtin</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </crypto>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <interface supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='backendType'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>default</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>passt</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </interface>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <panic supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='model'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>isa</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>hyperv</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </panic>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <console supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='type'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>null</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vc</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pty</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dev</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>file</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>pipe</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stdio</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>udp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tcp</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>unix</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>qemu-vdagent</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>dbus</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </console>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </devices>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <gic supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <genid supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <backup supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <async-teardown supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <ps2 supported='yes'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sev supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <sgx supported='no'/>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <hyperv supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='features'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>relaxed</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vapic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>spinlocks</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vpindex</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>runtime</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>synic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>stimer</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reset</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>vendor_id</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>frequencies</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>reenlightenment</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tlbflush</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>ipi</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>avic</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>emsr_bitmap</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>xmm_input</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </defaults>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </hyperv>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    <launchSecurity supported='yes'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      <enum name='sectype'>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:        <value>tdx</value>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:      </enum>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:    </launchSecurity>
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  </features>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </domainCapabilities>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.464 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.464 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.465 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.465 186018 INFO nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Secure Boot support detected
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.467 186018 INFO nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.467 186018 INFO nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.477 186018 DEBUG nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:41:25 np0005539505 nova_compute[186014]:  <model>Nehalem</model>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: </cpu>
Nov 29 01:41:25 np0005539505 nova_compute[186014]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.480 186018 DEBUG nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.538 186018 INFO nova.virt.node [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Determined node identity 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from /var/lib/nova/compute_id
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.555 186018 WARNING nova.compute.manager [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Compute nodes ['2d55ea77-8118-4f48-9bb5-d62d10fd53c0'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.584 186018 INFO nova.compute.manager [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.664 186018 WARNING nova.compute.manager [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.664 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.665 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.665 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.665 186018 DEBUG nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:41:25 np0005539505 systemd[1]: Starting libvirt nodedev daemon...
Nov 29 01:41:25 np0005539505 systemd[1]: Started libvirt nodedev daemon.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.907 186018 WARNING nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.908 186018 DEBUG nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6188MB free_disk=73.54647064208984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.908 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.909 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.933 186018 WARNING nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] No compute node record for compute-2.ctlplane.example.com:2d55ea77-8118-4f48-9bb5-d62d10fd53c0: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 could not be found.
Nov 29 01:41:25 np0005539505 nova_compute[186014]: 2025-11-29 06:41:25.953 186018 INFO nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0
Nov 29 01:41:26 np0005539505 nova_compute[186014]: 2025-11-29 06:41:26.022 186018 DEBUG nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:41:26 np0005539505 nova_compute[186014]: 2025-11-29 06:41:26.023 186018 DEBUG nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:41:26 np0005539505 python3.9[186720]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:41:26 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:41:26.915 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:41:26.917 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:41:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.012 186018 INFO nova.scheduler.client.report [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] [req-f981e933-d432-4157-800a-54914e2d24c5] Created resource provider record via placement API for resource provider with UUID 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 and name compute-2.ctlplane.example.com.
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.038 186018 DEBUG nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:41:27 np0005539505 nova_compute[186014]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.038 186018 INFO nova.virt.libvirt.host [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] kernel doesn't support AMD SEV
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.039 186018 DEBUG nova.compute.provider_tree [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.040 186018 DEBUG nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.042 186018 DEBUG nova.virt.libvirt.driver [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:41:27 np0005539505 nova_compute[186014]:  <arch>x86_64</arch>
Nov 29 01:41:27 np0005539505 nova_compute[186014]:  <model>Nehalem</model>
Nov 29 01:41:27 np0005539505 nova_compute[186014]:  <vendor>AMD</vendor>
Nov 29 01:41:27 np0005539505 nova_compute[186014]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:41:27 np0005539505 nova_compute[186014]: </cpu>
Nov 29 01:41:27 np0005539505 nova_compute[186014]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.106 186018 DEBUG nova.scheduler.client.report [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Updated inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.107 186018 DEBUG nova.compute.provider_tree [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Updating resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.107 186018 DEBUG nova.compute.provider_tree [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.202 186018 DEBUG nova.compute.provider_tree [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Updating resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.233 186018 DEBUG nova.compute.resource_tracker [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.233 186018 DEBUG oslo_concurrency.lockutils [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.233 186018 DEBUG nova.service [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.316 186018 DEBUG nova.service [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.316 186018 DEBUG nova.servicegroup.drivers.db [None req-e90461ae-c81f-4328-b367-2f402751bdb3 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 29 01:41:27 np0005539505 python3.9[186895]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:41:27 np0005539505 systemd[1]: Stopping nova_compute container...
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.894 186018 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.896 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.896 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:41:27 np0005539505 nova_compute[186014]: 2025-11-29 06:41:27.896 186018 DEBUG oslo_concurrency.lockutils [None req-cfe5ca66-999d-4af3-ab66-4371b6c9259b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:41:28 np0005539505 virtqemud[186353]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 01:41:28 np0005539505 virtqemud[186353]: hostname: compute-2
Nov 29 01:41:28 np0005539505 virtqemud[186353]: End of file while reading data: Input/output error
Nov 29 01:41:28 np0005539505 systemd[1]: libpod-d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad.scope: Deactivated successfully.
Nov 29 01:41:28 np0005539505 systemd[1]: libpod-d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad.scope: Consumed 3.356s CPU time.
Nov 29 01:41:28 np0005539505 podman[186899]: 2025-11-29 06:41:28.328283098 +0000 UTC m=+0.741718097 container died d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 29 01:41:28 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad-userdata-shm.mount: Deactivated successfully.
Nov 29 01:41:28 np0005539505 systemd[1]: var-lib-containers-storage-overlay-bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3-merged.mount: Deactivated successfully.
Nov 29 01:41:29 np0005539505 podman[186899]: 2025-11-29 06:41:29.086751131 +0000 UTC m=+1.500186130 container cleanup d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:41:29 np0005539505 podman[186899]: nova_compute
Nov 29 01:41:29 np0005539505 podman[186930]: nova_compute
Nov 29 01:41:29 np0005539505 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 29 01:41:29 np0005539505 systemd[1]: Stopped nova_compute container.
Nov 29 01:41:29 np0005539505 systemd[1]: Starting nova_compute container...
Nov 29 01:41:29 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:41:29 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:29 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:29 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:29 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:29 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdcbae7d770ff9091d27d75320c1fcfbd0ac339f53258d0e79bcebef2a63f1a3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:30 np0005539505 podman[186943]: 2025-11-29 06:41:30.016439941 +0000 UTC m=+0.852026871 container init d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:41:30 np0005539505 podman[186943]: 2025-11-29 06:41:30.022052694 +0000 UTC m=+0.857639604 container start d13111f2c6794b73cca51bcc4ec786a1758ab3ca4ac12afd43dd71ea5f2e31ad (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + sudo -E kolla_set_configs
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Validating config file
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying service configuration files
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /etc/ceph
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Creating directory /etc/ceph
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /etc/ceph
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Writing out command to execute
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:30 np0005539505 nova_compute[186958]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 29 01:41:30 np0005539505 nova_compute[186958]: ++ cat /run_command
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + CMD=nova-compute
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + ARGS=
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + sudo kolla_copy_cacerts
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + [[ ! -n '' ]]
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + . kolla_extend_start
Nov 29 01:41:30 np0005539505 nova_compute[186958]: Running command: 'nova-compute'
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + echo 'Running command: '\''nova-compute'\'''
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + umask 0022
Nov 29 01:41:30 np0005539505 nova_compute[186958]: + exec nova-compute
Nov 29 01:41:30 np0005539505 podman[186943]: nova_compute
Nov 29 01:41:30 np0005539505 systemd[1]: Started nova_compute container.
Nov 29 01:41:31 np0005539505 python3.9[187122]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.204 186962 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.205 186962 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.205 186962 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.205 186962 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.346 186962 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.367 186962 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.368 186962 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:41:32 np0005539505 systemd[1]: Started libpod-conmon-b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff.scope.
Nov 29 01:41:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:41:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3057c0e3b8572c767313e0a2f53acfb357aba7cf731cfaf4c7ecc642f8124eb/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3057c0e3b8572c767313e0a2f53acfb357aba7cf731cfaf4c7ecc642f8124eb/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3057c0e3b8572c767313e0a2f53acfb357aba7cf731cfaf4c7ecc642f8124eb/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 29 01:41:32 np0005539505 podman[187149]: 2025-11-29 06:41:32.634269328 +0000 UTC m=+0.529807029 container init b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init)
Nov 29 01:41:32 np0005539505 podman[187149]: 2025-11-29 06:41:32.643405406 +0000 UTC m=+0.538943107 container start b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=nova_compute_init, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Applying nova statedir ownership
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 29 01:41:32 np0005539505 nova_compute_init[187173]: INFO:nova_statedir:Nova statedir ownership complete
Nov 29 01:41:32 np0005539505 systemd[1]: libpod-b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff.scope: Deactivated successfully.
Nov 29 01:41:32 np0005539505 python3.9[187122]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 29 01:41:32 np0005539505 podman[187174]: 2025-11-29 06:41:32.816078004 +0000 UTC m=+0.105854651 container died b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0)
Nov 29 01:41:32 np0005539505 nova_compute[186958]: 2025-11-29 06:41:32.974 186962 INFO nova.virt.driver [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.102 186962 INFO nova.compute.provider_config [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 29 01:41:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff-userdata-shm.mount: Deactivated successfully.
Nov 29 01:41:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-d3057c0e3b8572c767313e0a2f53acfb357aba7cf731cfaf4c7ecc642f8124eb-merged.mount: Deactivated successfully.
Nov 29 01:41:33 np0005539505 podman[187174]: 2025-11-29 06:41:33.23874828 +0000 UTC m=+0.528524827 container cleanup b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:41:33 np0005539505 systemd[1]: libpod-conmon-b7331d3ce283088ce521e3cccb36a9936cd85c1b13750ce9645f834b60b0e5ff.scope: Deactivated successfully.
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.257 186962 DEBUG oslo_concurrency.lockutils [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.257 186962 DEBUG oslo_concurrency.lockutils [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.257 186962 DEBUG oslo_concurrency.lockutils [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.258 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.259 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.260 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.261 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.262 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.263 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.264 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.264 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.264 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.264 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.264 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.265 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.266 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.267 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.268 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.268 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.268 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.268 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.268 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.269 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.269 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.269 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.269 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.269 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.270 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.270 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.270 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.270 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.271 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.272 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.272 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.272 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.272 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.272 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.273 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.273 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.273 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.274 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.275 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.275 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.275 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.275 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.275 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.276 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.277 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.277 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.277 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.277 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.277 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.278 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.279 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.280 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.281 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.282 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.283 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.284 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.285 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.285 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.289 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.289 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.289 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.290 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.291 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.292 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.293 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.293 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.293 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.293 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.294 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.294 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.294 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.294 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.295 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.296 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.297 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.298 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.299 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.299 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.299 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.299 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.299 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.300 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.301 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.302 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.303 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.304 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.305 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.306 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.307 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.308 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.309 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.310 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.311 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.311 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.311 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.311 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.311 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.312 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.313 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.314 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.315 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.316 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.317 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.318 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.319 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.320 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.321 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.322 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.322 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.322 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.322 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.322 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.323 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.324 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.325 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.326 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.327 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.328 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.329 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.330 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.331 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.332 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.333 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.334 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.334 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.334 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.334 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.334 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.335 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.336 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.337 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.338 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.339 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.340 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.340 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.340 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.340 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.341 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.342 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.342 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.342 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.342 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.342 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.343 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.344 186962 WARNING oslo_config.cfg [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 29 01:41:33 np0005539505 nova_compute[186958]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 29 01:41:33 np0005539505 nova_compute[186958]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 29 01:41:33 np0005539505 nova_compute[186958]: and ``live_migration_inbound_addr`` respectively.
Nov 29 01:41:33 np0005539505 nova_compute[186958]: ).  Its value may be silently ignored in the future.#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.345 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.346 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.347 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.348 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.349 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.349 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.349 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.349 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.349 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.350 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.351 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.352 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.353 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.354 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.355 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.356 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.357 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.358 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.359 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.360 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.360 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.360 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.360 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.360 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.361 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.362 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.363 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.364 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.365 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.366 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.367 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.368 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.369 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.370 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.371 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.372 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.373 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.373 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.373 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.373 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.373 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.374 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.375 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.376 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.376 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.376 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.376 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.376 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.377 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.378 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.379 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.380 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.381 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.382 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.383 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.384 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.384 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.384 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.384 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.384 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.385 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.386 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.387 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.388 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.388 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.388 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.388 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.388 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.389 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.389 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.389 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.389 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.389 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.390 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.391 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.392 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.392 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.392 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.392 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.392 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.393 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.394 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.395 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.396 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.397 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.398 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.399 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.400 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.401 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.402 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.403 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.404 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.405 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.406 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.407 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.408 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.408 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.408 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.408 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.408 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.409 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.410 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.410 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.410 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.410 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.410 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.411 186962 DEBUG oslo_service.service [None req-347fc608-ed5a-46c6-b93e-e62a2c72c531 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.413 186962 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.471 186962 INFO nova.virt.node [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Determined node identity 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from /var/lib/nova/compute_id#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.471 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.472 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.472 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.473 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.485 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4e72f84910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.487 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4e72f84910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.487 186962 INFO nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.494 186962 INFO nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt host capabilities <capabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <host>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <uuid>e5303377-33ba-4369-bb3f-4ec9de742582</uuid>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <arch>x86_64</arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model>EPYC-Rome-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <microcode version='16777317'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <signature family='23' model='49' stepping='0'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='x2apic'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='tsc-deadline'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='osxsave'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='hypervisor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='tsc_adjust'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='spec-ctrl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='stibp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='arch-capabilities'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='cmp_legacy'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='topoext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='virt-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='lbrv'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='tsc-scale'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='vmcb-clean'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='pause-filter'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='pfthreshold'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='svme-addr-chk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='rdctl-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='skip-l1dfl-vmentry'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='mds-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature name='pschange-mc-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <pages unit='KiB' size='4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <pages unit='KiB' size='2048'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <pages unit='KiB' size='1048576'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <power_management>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <suspend_mem/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <suspend_disk/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <suspend_hybrid/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </power_management>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <iommu support='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <migration_features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <live/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <uri_transports>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <uri_transport>tcp</uri_transport>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <uri_transport>rdma</uri_transport>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </uri_transports>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </migration_features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <topology>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <cells num='1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <cell id='0'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <memory unit='KiB'>7864312</memory>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <pages unit='KiB' size='4'>1966078</pages>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <pages unit='KiB' size='2048'>0</pages>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <distances>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <sibling id='0' value='10'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          </distances>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          <cpus num='8'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:          </cpus>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        </cell>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </cells>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </topology>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <cache>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </cache>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <secmodel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model>selinux</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <doi>0</doi>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </secmodel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <secmodel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model>dac</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <doi>0</doi>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </secmodel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </host>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <guest>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <os_type>hvm</os_type>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <arch name='i686'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <wordsize>32</wordsize>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <domain type='qemu'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <domain type='kvm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <pae/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <nonpae/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <apic default='on' toggle='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <cpuselection/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <deviceboot/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <externalSnapshot/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </guest>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <guest>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <os_type>hvm</os_type>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <arch name='x86_64'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <wordsize>64</wordsize>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <domain type='qemu'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <domain type='kvm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <acpi default='on' toggle='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <apic default='on' toggle='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <cpuselection/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <deviceboot/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <disksnapshot default='on' toggle='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <externalSnapshot/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </guest>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </capabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.501 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.505 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 29 01:41:33 np0005539505 nova_compute[186958]: <domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <domain>kvm</domain>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <arch>i686</arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <vcpu max='4096'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <iothreads supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <os supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='firmware'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <loader supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>rom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pflash</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='readonly'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>yes</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='secure'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </loader>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='maximumMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='succor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='custom' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-128'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-256'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-512'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <memoryBacking supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='sourceType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>anonymous</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>memfd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </memoryBacking>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <disk supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='diskDevice'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>disk</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cdrom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>floppy</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>lun</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>fdc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>sata</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <graphics supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vnc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egl-headless</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <video supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='modelType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vga</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cirrus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>none</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>bochs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ramfb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hostdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='mode'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>subsystem</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='startupPolicy'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>mandatory</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>requisite</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>optional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='subsysType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pci</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='capsType'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='pciBackend'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hostdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <rng supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>random</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <filesystem supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='driverType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>path</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>handle</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtiofs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </filesystem>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <tpm supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-tis</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-crb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emulator</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>external</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendVersion'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>2.0</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </tpm>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <redirdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </redirdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <channel supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </channel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <crypto supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </crypto>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <interface supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>passt</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <panic supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>isa</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>hyperv</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </panic>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <console supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>null</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dev</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pipe</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stdio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>udp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tcp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu-vdagent</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </console>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <gic supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <genid supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backup supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <async-teardown supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <ps2 supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sev supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sgx supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hyperv supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='features'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>relaxed</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vapic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>spinlocks</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vpindex</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>runtime</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>synic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stimer</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reset</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vendor_id</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>frequencies</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reenlightenment</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tlbflush</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ipi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>avic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emsr_bitmap</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>xmm_input</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hyperv>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <launchSecurity supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='sectype'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tdx</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </launchSecurity>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
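The `domainCapabilities` XML that nova_compute logs above pairs each `usable='no'` `<model>` with a `<blockers>` element naming the host features that make it unusable. A minimal sketch of extracting that mapping from such a dump, using only the standard library and a hypothetical excerpt of the logged XML (the real document comes from libvirt's `getDomainCapabilities()` call referenced in nova's `host.py`):

```python
import xml.etree.ElementTree as ET

# Hypothetical excerpt of the domainCapabilities XML seen in the log above
# (custom CPU mode section only), for illustration.
CAPS_XML = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes' vendor='Intel'>Westmere</model>
      <model usable='no' vendor='Intel'>Snowridge-v1</model>
      <blockers model='Snowridge-v1'>
        <feature name='cldemote'/>
        <feature name='mpx'/>
      </blockers>
    </mode>
  </cpu>
</domainCapabilities>
"""

def model_blockers(xml_text):
    """Map each custom-mode CPU model to the features blocking it ([] if usable)."""
    root = ET.fromstring(xml_text)
    mode = root.find("./cpu/mode[@name='custom']")
    blockers = {
        b.get("model"): [f.get("name") for f in b.findall("feature")]
        for b in mode.findall("blockers")
    }
    return {m.text: blockers.get(m.text, []) for m in mode.findall("model")}

models = model_blockers(CAPS_XML)
print(models)  # {'Westmere': [], 'Snowridge-v1': ['cldemote', 'mpx']}
```

Applied to the full dump above, the same walk explains, e.g., why every Snowridge variant is `usable='no'` on this EPYC-Rome host: each carries Intel-only blockers such as `cldemote` and `movdiri`.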
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.508 186962 DEBUG nova.virt.libvirt.volume.mount [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.511 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 29 01:41:33 np0005539505 nova_compute[186958]: <domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <domain>kvm</domain>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <arch>i686</arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <vcpu max='240'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <iothreads supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <os supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='firmware'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <loader supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>rom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pflash</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='readonly'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>yes</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='secure'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </loader>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='maximumMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='succor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='custom' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-128'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-256'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-512'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <memoryBacking supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='sourceType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>anonymous</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>memfd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </memoryBacking>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <disk supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='diskDevice'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>disk</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cdrom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>floppy</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>lun</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ide</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>fdc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>sata</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <graphics supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vnc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egl-headless</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <video supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='modelType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vga</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cirrus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>none</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>bochs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ramfb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hostdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='mode'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>subsystem</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='startupPolicy'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>mandatory</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>requisite</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>optional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='subsysType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pci</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='capsType'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='pciBackend'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hostdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <rng supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>random</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <filesystem supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='driverType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>path</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>handle</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtiofs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </filesystem>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <tpm supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-tis</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-crb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emulator</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>external</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendVersion'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>2.0</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </tpm>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <redirdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </redirdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <channel supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </channel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <crypto supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </crypto>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <interface supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>passt</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <panic supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>isa</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>hyperv</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </panic>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <console supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>null</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dev</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pipe</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stdio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>udp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tcp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu-vdagent</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </console>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <gic supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <genid supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backup supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <async-teardown supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <ps2 supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sev supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sgx supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hyperv supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='features'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>relaxed</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vapic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>spinlocks</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vpindex</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>runtime</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>synic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stimer</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reset</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vendor_id</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>frequencies</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reenlightenment</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tlbflush</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ipi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>avic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emsr_bitmap</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>xmm_input</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hyperv>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <launchSecurity supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='sectype'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tdx</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </launchSecurity>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.540 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.544 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 29 01:41:33 np0005539505 nova_compute[186958]: <domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <domain>kvm</domain>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <arch>x86_64</arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <vcpu max='4096'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <iothreads supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <os supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='firmware'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>efi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <loader supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>rom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pflash</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='readonly'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>yes</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='secure'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>yes</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </loader>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='maximumMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='succor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='custom' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-128'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-256'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-512'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <memoryBacking supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='sourceType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>anonymous</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>memfd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </memoryBacking>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <disk supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='diskDevice'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>disk</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cdrom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>floppy</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>lun</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>fdc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>sata</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <graphics supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vnc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egl-headless</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <video supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='modelType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vga</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cirrus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>none</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>bochs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ramfb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hostdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='mode'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>subsystem</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='startupPolicy'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>mandatory</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>requisite</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>optional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='subsysType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pci</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='capsType'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='pciBackend'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hostdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <rng supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>random</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <filesystem supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='driverType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>path</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>handle</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtiofs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </filesystem>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <tpm supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-tis</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-crb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emulator</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>external</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendVersion'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>2.0</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </tpm>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <redirdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </redirdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <channel supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </channel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <crypto supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </crypto>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <interface supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>passt</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <panic supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>isa</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>hyperv</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </panic>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <console supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>null</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dev</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pipe</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stdio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>udp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tcp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu-vdagent</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </console>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <gic supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <genid supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backup supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <async-teardown supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <ps2 supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sev supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sgx supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hyperv supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='features'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>relaxed</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vapic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>spinlocks</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vpindex</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>runtime</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>synic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stimer</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reset</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vendor_id</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>frequencies</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reenlightenment</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tlbflush</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ipi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>avic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emsr_bitmap</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>xmm_input</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hyperv>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <launchSecurity supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='sectype'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tdx</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </launchSecurity>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.601 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 29 01:41:33 np0005539505 nova_compute[186958]: <domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <path>/usr/libexec/qemu-kvm</path>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <domain>kvm</domain>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <arch>x86_64</arch>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <vcpu max='240'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <iothreads supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <os supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='firmware'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <loader supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>rom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pflash</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='readonly'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>yes</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='secure'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>no</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </loader>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-passthrough' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='hostPassthroughMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='maximum' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='maximumMigratable'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>on</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>off</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='host-model' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <vendor>AMD</vendor>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='x2apic'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-deadline'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='hypervisor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc_adjust'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='spec-ctrl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='stibp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='cmp_legacy'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='overflow-recov'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='succor'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='amd-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='virt-ssbd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lbrv'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='tsc-scale'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='vmcb-clean'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='flushbyasid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pause-filter'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='pfthreshold'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='svme-addr-chk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <feature policy='disable' name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <mode name='custom' supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Broadwell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cascadelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Cooperlake-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Denverton-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Dhyana-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Genoa-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='auto-ibrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Milan-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amd-psfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='no-nested-data-bp'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='null-sel-clr-base'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='stibp-always-on'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-Rome-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='EPYC-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='GraniteRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-128'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-256'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx10-512'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='prefetchiti'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Haswell-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-noTSX'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v6'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Icelake-Server-v7'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='IvyBridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='KnightsMill-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4fmaps'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-4vnniw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512er'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512pf'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G4-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Opteron_G5-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fma4'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tbm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xop'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SapphireRapids-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='amx-tile'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-bf16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-fp16'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512-vpopcntdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bitalg'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vbmi2'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrc'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fzrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='la57'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='taa-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='tsx-ldtrk'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xfd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='SierraForest-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ifma'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-ne-convert'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx-vnni-int8'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='bus-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cmpccxadd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fbsdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='fsrs'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ibrs-all'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mcdt-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pbrsb-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='psdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='sbdr-ssdp-no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='serialize'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vaes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='vpclmulqdq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Client-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='hle'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='rtm'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Skylake-Server-v5'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512bw'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512cd'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512dq'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512f'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='avx512vl'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='invpcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pcid'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='pku'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='mpx'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v2'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v3'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='core-capability'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='split-lock-detect'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='Snowridge-v4'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='cldemote'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='erms'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='gfni'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdir64b'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='movdiri'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='xsaves'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='athlon-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='core2duo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='coreduo-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='n270-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='ss'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <blockers model='phenom-v1'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnow'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <feature name='3dnowext'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </blockers>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </mode>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <memoryBacking supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <enum name='sourceType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>anonymous</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <value>memfd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </memoryBacking>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <disk supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='diskDevice'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>disk</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cdrom</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>floppy</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>lun</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ide</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>fdc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>sata</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <graphics supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vnc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egl-headless</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <video supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='modelType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vga</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>cirrus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>none</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>bochs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ramfb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hostdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='mode'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>subsystem</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='startupPolicy'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>mandatory</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>requisite</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>optional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='subsysType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pci</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>scsi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='capsType'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='pciBackend'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hostdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <rng supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtio-non-transitional</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>random</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>egd</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <filesystem supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='driverType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>path</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>handle</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>virtiofs</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </filesystem>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <tpm supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-tis</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tpm-crb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emulator</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>external</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendVersion'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>2.0</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </tpm>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <redirdev supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='bus'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>usb</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </redirdev>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <channel supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </channel>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <crypto supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendModel'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>builtin</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </crypto>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <interface supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='backendType'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>default</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>passt</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <panic supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='model'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>isa</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>hyperv</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </panic>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <console supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='type'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>null</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vc</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pty</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dev</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>file</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>pipe</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stdio</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>udp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tcp</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>unix</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>qemu-vdagent</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>dbus</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </console>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <gic supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <vmcoreinfo supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <genid supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backingStoreInput supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <backup supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <async-teardown supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <ps2 supported='yes'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sev supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <sgx supported='no'/>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <hyperv supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='features'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>relaxed</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vapic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>spinlocks</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vpindex</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>runtime</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>synic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>stimer</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reset</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>vendor_id</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>frequencies</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>reenlightenment</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tlbflush</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>ipi</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>avic</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>emsr_bitmap</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>xmm_input</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <spinlocks>4095</spinlocks>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <stimer_direct>on</stimer_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_direct>on</tlbflush_direct>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <tlbflush_extended>on</tlbflush_extended>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </defaults>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </hyperv>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    <launchSecurity supported='yes'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      <enum name='sectype'>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:        <value>tdx</value>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:      </enum>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:    </launchSecurity>
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </domainCapabilities>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.664 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.665 186962 INFO nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Secure Boot support detected#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.666 186962 INFO nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.667 186962 INFO nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.675 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] cpu compare xml: <cpu match="exact">
Nov 29 01:41:33 np0005539505 nova_compute[186958]:  <model>Nehalem</model>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: </cpu>
Nov 29 01:41:33 np0005539505 nova_compute[186958]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.677 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.700 186962 INFO nova.virt.node [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Determined node identity 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from /var/lib/nova/compute_id#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.723 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Verified node 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 29 01:41:33 np0005539505 nova_compute[186958]: 2025-11-29 06:41:33.772 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.041 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.041 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.041 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.042 186962 DEBUG nova.compute.resource_tracker [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:41:34 np0005539505 systemd[1]: session-24.scope: Deactivated successfully.
Nov 29 01:41:34 np0005539505 systemd[1]: session-24.scope: Consumed 1min 48.032s CPU time.
Nov 29 01:41:34 np0005539505 systemd-logind[794]: Session 24 logged out. Waiting for processes to exit.
Nov 29 01:41:34 np0005539505 systemd-logind[794]: Removed session 24.
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.193 186962 WARNING nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.195 186962 DEBUG nova.compute.resource_tracker [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6212MB free_disk=73.54496383666992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.195 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.195 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.580 186962 DEBUG nova.compute.resource_tracker [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.580 186962 DEBUG nova.compute.resource_tracker [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.607 186962 DEBUG nova.scheduler.client.report [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.652 186962 DEBUG nova.scheduler.client.report [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.652 186962 DEBUG nova.compute.provider_tree [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.692 186962 DEBUG nova.scheduler.client.report [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.738 186962 DEBUG nova.scheduler.client.report [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.858 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 29 01:41:34 np0005539505 nova_compute[186958]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.859 186962 INFO nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.859 186962 DEBUG nova.compute.provider_tree [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.860 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.861 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Libvirt baseline CPU <cpu>
Nov 29 01:41:34 np0005539505 nova_compute[186958]:  <arch>x86_64</arch>
Nov 29 01:41:34 np0005539505 nova_compute[186958]:  <model>Nehalem</model>
Nov 29 01:41:34 np0005539505 nova_compute[186958]:  <vendor>AMD</vendor>
Nov 29 01:41:34 np0005539505 nova_compute[186958]:  <topology sockets="8" cores="1" threads="1"/>
Nov 29 01:41:34 np0005539505 nova_compute[186958]: </cpu>
Nov 29 01:41:34 np0005539505 nova_compute[186958]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.885 186962 DEBUG nova.scheduler.client.report [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.927 186962 DEBUG nova.compute.resource_tracker [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.928 186962 DEBUG oslo_concurrency.lockutils [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.928 186962 DEBUG nova.service [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.966 186962 DEBUG nova.service [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 29 01:41:34 np0005539505 nova_compute[186958]: 2025-11-29 06:41:34.966 186962 DEBUG nova.servicegroup.drivers.db [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 29 01:41:36 np0005539505 podman[187260]: 2025-11-29 06:41:36.731013649 +0000 UTC m=+0.057589568 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:41:40 np0005539505 systemd-logind[794]: New session 26 of user zuul.
Nov 29 01:41:40 np0005539505 systemd[1]: Started Session 26 of User zuul.
Nov 29 01:41:41 np0005539505 python3.9[187432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 29 01:41:42 np0005539505 podman[187560]: 2025-11-29 06:41:42.893132778 +0000 UTC m=+0.115017229 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 01:41:43 np0005539505 python3.9[187604]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:43 np0005539505 systemd[1]: Reloading.
Nov 29 01:41:43 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:43 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:44 np0005539505 python3.9[187798]: ansible-ansible.builtin.service_facts Invoked
Nov 29 01:41:44 np0005539505 network[187815]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 29 01:41:44 np0005539505 network[187816]: 'network-scripts' will be removed from distribution in near future.
Nov 29 01:41:44 np0005539505 network[187817]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 29 01:41:49 np0005539505 podman[187936]: 2025-11-29 06:41:49.386014878 +0000 UTC m=+0.061865307 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:41:50 np0005539505 python3.9[188113]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:41:51 np0005539505 python3.9[188266]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:51 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:51 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:41:52 np0005539505 python3.9[188419]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:41:53 np0005539505 python3.9[188571]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:54 np0005539505 python3.9[188723]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:41:55 np0005539505 python3.9[188875]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:41:55 np0005539505 systemd[1]: Reloading.
Nov 29 01:41:55 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:41:55 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:41:56 np0005539505 python3.9[189063]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:41:57 np0005539505 python3.9[189216]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:41:58 np0005539505 python3.9[189366]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:41:59 np0005539505 python3.9[189518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:00 np0005539505 python3.9[189639]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398518.9048648-366-156009595410212/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:01 np0005539505 python3.9[189791]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 29 01:42:01 np0005539505 nova_compute[186958]: 2025-11-29 06:42:01.969 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:02 np0005539505 nova_compute[186958]: 2025-11-29 06:42:02.067 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:02 np0005539505 python3.9[189943]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 29 01:42:03 np0005539505 python3.9[190096]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 29 01:42:04 np0005539505 python3.9[190254]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 29 01:42:06 np0005539505 python3.9[190412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:06 np0005539505 python3.9[190533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398525.6821663-570-179335537021233/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:07 np0005539505 podman[190657]: 2025-11-29 06:42:07.056617848 +0000 UTC m=+0.063542604 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:42:07 np0005539505 python3.9[190693]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:07 np0005539505 python3.9[190823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398526.7395039-570-43404973140450/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:08 np0005539505 python3.9[190973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:08 np0005539505 python3.9[191094]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764398527.8722625-570-133334308635889/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:10 np0005539505 python3.9[191244]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:10 np0005539505 python3.9[191396]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:11 np0005539505 python3.9[191548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:12 np0005539505 python3.9[191669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398531.2045977-747-209719375098529/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:12 np0005539505 python3.9[191819]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:13 np0005539505 podman[191820]: 2025-11-29 06:42:13.093794168 +0000 UTC m=+0.101305573 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:13 np0005539505 python3.9[191922]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:14 np0005539505 python3.9[192072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:14 np0005539505 python3.9[192193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398533.6112823-747-206793738161217/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:15 np0005539505 python3.9[192343]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:16 np0005539505 python3.9[192464]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398535.1882734-747-51100819511995/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:16 np0005539505 python3.9[192614]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:17 np0005539505 python3.9[192735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398536.4020212-747-128722286336481/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:17 np0005539505 python3.9[192885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:18 np0005539505 python3.9[193006]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398537.505172-747-76404195271151/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:19 np0005539505 python3.9[193156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:19 np0005539505 podman[193251]: 2025-11-29 06:42:19.567206514 +0000 UTC m=+0.087801514 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 29 01:42:19 np0005539505 python3.9[193293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398538.7279353-747-200340068259164/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:20 np0005539505 python3.9[193446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:20 np0005539505 python3.9[193567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398539.834317-747-27997966827206/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:21 np0005539505 python3.9[193717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:21 np0005539505 python3.9[193838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398540.963319-747-240841221325419/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:22 np0005539505 python3.9[193988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:23 np0005539505 python3.9[194109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398542.0677123-747-257306914864103/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:23 np0005539505 python3.9[194259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:24 np0005539505 python3.9[194380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398543.4595644-747-123654738227796/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:26 np0005539505 python3.9[194530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:26 np0005539505 python3.9[194606]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:42:26.916 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:42:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:42:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:27 np0005539505 python3.9[194756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:27 np0005539505 python3.9[194832]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:28 np0005539505 python3.9[194982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:28 np0005539505 python3.9[195058]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:29 np0005539505 python3.9[195210]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:30 np0005539505 python3.9[195362]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:31 np0005539505 python3.9[195514]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:32 np0005539505 python3.9[195666]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:42:32 np0005539505 systemd[1]: Reloading.
Nov 29 01:42:32 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:32 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.381 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.382 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.382 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.401 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.403 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.403 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.403 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.404 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.404 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.404 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.428 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:42:32 np0005539505 systemd[1]: Listening on Podman API Socket.
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.619 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.620 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6173MB free_disk=73.54440689086914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.620 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.620 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.712 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.712 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.734 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.749 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.751 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:42:32 np0005539505 nova_compute[186958]: 2025-11-29 06:42:32.751 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:42:34 np0005539505 python3.9[195857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:34 np0005539505 python3.9[195980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8863602-1413-149358024094375/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:35 np0005539505 python3.9[196056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:35 np0005539505 python3.9[196179]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398553.8863602-1413-149358024094375/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:37 np0005539505 python3.9[196331]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 29 01:42:37 np0005539505 podman[196408]: 2025-11-29 06:42:37.75755802 +0000 UTC m=+0.073659777 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:42:38 np0005539505 python3.9[196502]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:39 np0005539505 python3[196654]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:39 np0005539505 podman[196692]: 2025-11-29 06:42:39.573393994 +0000 UTC m=+0.046930958 container create d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:42:39 np0005539505 podman[196692]: 2025-11-29 06:42:39.547185039 +0000 UTC m=+0.020722003 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 29 01:42:39 np0005539505 python3[196654]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 29 01:42:40 np0005539505 python3.9[196885]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:41 np0005539505 python3.9[197039]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:41 np0005539505 python3.9[197190]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398561.3981583-1604-188062871873122/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:42 np0005539505 python3.9[197266]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:42 np0005539505 systemd[1]: Reloading.
Nov 29 01:42:42 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:42 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:43 np0005539505 podman[197301]: 2025-11-29 06:42:43.490102215 +0000 UTC m=+0.124113173 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:42:43 np0005539505 python3.9[197402]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:42:43 np0005539505 systemd[1]: Reloading.
Nov 29 01:42:44 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:44 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:44 np0005539505 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 01:42:44 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:42:44 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:44 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.
Nov 29 01:42:44 np0005539505 podman[197442]: 2025-11-29 06:42:44.414349109 +0000 UTC m=+0.124590525 container init d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + sudo -E kolla_set_configs
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:44 np0005539505 podman[197442]: 2025-11-29 06:42:44.451395789 +0000 UTC m=+0.161637155 container start d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:44 np0005539505 podman[197442]: ceilometer_agent_compute
Nov 29 01:42:44 np0005539505 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Validating config file
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Copying service configuration files
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: INFO:__main__:Writing out command to execute
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: ++ cat /run_command
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + ARGS=
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + sudo kolla_copy_cacerts
Nov 29 01:42:44 np0005539505 podman[197463]: 2025-11-29 06:42:44.518363687 +0000 UTC m=+0.055376924 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:44 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-2d00c3d57756fa4f.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:42:44 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-2d00c3d57756fa4f.service: Failed with result 'exit-code'.
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + [[ ! -n '' ]]
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + . kolla_extend_start
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + umask 0022
Nov 29 01:42:44 np0005539505 ceilometer_agent_compute[197456]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 01:42:45 np0005539505 python3.9[197638]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.486 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.486 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.486 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.486 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.487 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.488 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.489 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.490 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.491 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.492 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.493 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.494 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.495 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.496 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.497 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.498 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.499 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.500 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.501 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.502 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.503 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.522 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.523 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.524 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.609 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.684 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.684 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.684 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.684 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.684 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.685 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.686 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.687 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.688 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.689 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.690 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.691 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.692 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.693 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.694 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.695 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.696 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.697 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.698 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.699 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.700 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.701 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.702 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.703 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.704 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.705 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.705 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.705 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.705 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.707 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.712 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.856 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.957 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.957 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.957 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 29 01:42:45 np0005539505 ceilometer_agent_compute[197456]: 2025-11-29 06:42:45.966 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 29 01:42:45 np0005539505 virtqemud[186353]: End of file while reading data: Input/output error
Nov 29 01:42:45 np0005539505 virtqemud[186353]: End of file while reading data: Input/output error
Nov 29 01:42:46 np0005539505 systemd[1]: libpod-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope: Deactivated successfully.
Nov 29 01:42:46 np0005539505 systemd[1]: libpod-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope: Consumed 1.449s CPU time.
Nov 29 01:42:46 np0005539505 conmon[197456]: conmon d18825ddbfe692f198ea <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope/container/memory.events
Nov 29 01:42:46 np0005539505 podman[197642]: 2025-11-29 06:42:46.121651808 +0000 UTC m=+0.607754509 container died d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 01:42:46 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-2d00c3d57756fa4f.timer: Deactivated successfully.
Nov 29 01:42:46 np0005539505 systemd[1]: Stopped /usr/bin/podman healthcheck run d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.
Nov 29 01:42:46 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-userdata-shm.mount: Deactivated successfully.
Nov 29 01:42:46 np0005539505 systemd[1]: var-lib-containers-storage-overlay-87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2-merged.mount: Deactivated successfully.
Nov 29 01:42:46 np0005539505 podman[197642]: 2025-11-29 06:42:46.384391558 +0000 UTC m=+0.870494259 container cleanup d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 01:42:46 np0005539505 podman[197642]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539505 podman[197677]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539505 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 29 01:42:46 np0005539505 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 29 01:42:46 np0005539505 systemd[1]: Starting ceilometer_agent_compute container...
Nov 29 01:42:46 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:42:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87f46962b1d2b1bde4d4d71442807a78b071b2d6c407c90ceeff7fcf09a0a1d2/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:46 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.
Nov 29 01:42:46 np0005539505 podman[197690]: 2025-11-29 06:42:46.735504977 +0000 UTC m=+0.252855484 container init d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + sudo -E kolla_set_configs
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:46 np0005539505 podman[197690]: 2025-11-29 06:42:46.7783864 +0000 UTC m=+0.295736917 container start d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Validating config file
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Copying service configuration files
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: INFO:__main__:Writing out command to execute
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: ++ cat /run_command
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + ARGS=
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + sudo kolla_copy_cacerts
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: sudo: unable to send audit message: Operation not permitted
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + [[ ! -n '' ]]
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + . kolla_extend_start
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + umask 0022
Nov 29 01:42:46 np0005539505 ceilometer_agent_compute[197706]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 29 01:42:46 np0005539505 podman[197690]: ceilometer_agent_compute
Nov 29 01:42:46 np0005539505 systemd[1]: Started ceilometer_agent_compute container.
Nov 29 01:42:47 np0005539505 podman[197713]: 2025-11-29 06:42:47.022164778 +0000 UTC m=+0.233190992 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:42:47 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:42:47 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Failed with result 'exit-code'.
Nov 29 01:42:47 np0005539505 python3.9[197889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.853 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.853 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.853 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.853 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.854 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.855 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.856 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.857 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.858 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.859 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.860 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.861 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.862 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.863 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.864 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.865 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.866 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.867 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.868 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.869 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.888 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.889 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.889 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 29 01:42:47 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:47.901 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.040 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.040 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.040 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.041 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.042 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.043 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.044 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.045 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.046 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.047 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.048 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.049 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.050 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.051 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.052 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.053 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.054 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.055 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.056 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.057 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.058 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.059 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.060 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.061 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.062 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.063 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.064 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.065 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.066 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.068 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.074 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:42:48.080 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:42:48 np0005539505 python3.9[198015]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398567.218527-1701-153759587966413/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:42:49 np0005539505 python3.9[198170]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 29 01:42:49 np0005539505 podman[198195]: 2025-11-29 06:42:49.739905278 +0000 UTC m=+0.073191854 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:42:50 np0005539505 python3.9[198344]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:42:53 np0005539505 python3[198496]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:42:53 np0005539505 podman[198529]: 2025-11-29 06:42:53.801464543 +0000 UTC m=+0.020720102 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 29 01:42:54 np0005539505 podman[198529]: 2025-11-29 06:42:54.973456167 +0000 UTC m=+1.192711696 container create 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Nov 29 01:42:54 np0005539505 python3[198496]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 29 01:42:55 np0005539505 python3.9[198718]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:42:56 np0005539505 python3.9[198872]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:57 np0005539505 python3.9[199023]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398576.7204423-1859-40760731049020/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:42:57 np0005539505 python3.9[199099]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:42:57 np0005539505 systemd[1]: Reloading.
Nov 29 01:42:57 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:57 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:58 np0005539505 python3.9[199211]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:42:58 np0005539505 systemd[1]: Reloading.
Nov 29 01:42:58 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:42:58 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:42:58 np0005539505 systemd[1]: Starting node_exporter container...
Nov 29 01:42:59 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:42:59 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e0d88569c0ff6ba65c0ca866e613fbf017586e5846d2f24bed10c7defaa7f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:59 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e0d88569c0ff6ba65c0ca866e613fbf017586e5846d2f24bed10c7defaa7f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:42:59 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.
Nov 29 01:42:59 np0005539505 podman[199252]: 2025-11-29 06:42:59.058330956 +0000 UTC m=+0.103909136 container init 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.072Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.073Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.074Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.075Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 01:42:59 np0005539505 node_exporter[199267]: ts=2025-11-29T06:42:59.075Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 01:42:59 np0005539505 podman[199252]: 2025-11-29 06:42:59.084344846 +0000 UTC m=+0.129923026 container start 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:42:59 np0005539505 podman[199252]: node_exporter
Nov 29 01:42:59 np0005539505 systemd[1]: Started node_exporter container.
Nov 29 01:42:59 np0005539505 podman[199276]: 2025-11-29 06:42:59.145064418 +0000 UTC m=+0.051148506 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:43:01 np0005539505 python3.9[199451]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:01 np0005539505 systemd[1]: Stopping node_exporter container...
Nov 29 01:43:01 np0005539505 systemd[1]: libpod-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.scope: Deactivated successfully.
Nov 29 01:43:01 np0005539505 podman[199455]: 2025-11-29 06:43:01.417067446 +0000 UTC m=+0.056968949 container died 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:43:01 np0005539505 systemd[1]: 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf-75ec829866a23c46.timer: Deactivated successfully.
Nov 29 01:43:01 np0005539505 systemd[1]: Stopped /usr/bin/podman healthcheck run 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.
Nov 29 01:43:01 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:01 np0005539505 systemd[1]: var-lib-containers-storage-overlay-b34e0d88569c0ff6ba65c0ca866e613fbf017586e5846d2f24bed10c7defaa7f-merged.mount: Deactivated successfully.
Nov 29 01:43:01 np0005539505 podman[199455]: 2025-11-29 06:43:01.468227911 +0000 UTC m=+0.108129414 container cleanup 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:43:01 np0005539505 podman[199455]: node_exporter
Nov 29 01:43:01 np0005539505 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:01 np0005539505 podman[199483]: node_exporter
Nov 29 01:43:01 np0005539505 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:01 np0005539505 systemd[1]: Stopped node_exporter container.
Nov 29 01:43:01 np0005539505 systemd[1]: Starting node_exporter container...
Nov 29 01:43:01 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:43:01 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e0d88569c0ff6ba65c0ca866e613fbf017586e5846d2f24bed10c7defaa7f/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e0d88569c0ff6ba65c0ca866e613fbf017586e5846d2f24bed10c7defaa7f/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:01 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.
Nov 29 01:43:01 np0005539505 podman[199493]: 2025-11-29 06:43:01.671869833 +0000 UTC m=+0.114408090 container init 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.683Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.683Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.683Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.683Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.683Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=arp
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=bcache
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=bonding
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=cpu
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=edac
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=filefd
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=netclass
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=netdev
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=netstat
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=nfs
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=nvme
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=softnet
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=systemd
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=xfs
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.684Z caller=node_exporter.go:117 level=info collector=zfs
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.685Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 29 01:43:01 np0005539505 node_exporter[199509]: ts=2025-11-29T06:43:01.685Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 29 01:43:01 np0005539505 podman[199493]: 2025-11-29 06:43:01.696941497 +0000 UTC m=+0.139479744 container start 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:43:01 np0005539505 podman[199493]: node_exporter
Nov 29 01:43:01 np0005539505 systemd[1]: Started node_exporter container.
Nov 29 01:43:01 np0005539505 podman[199518]: 2025-11-29 06:43:01.778451183 +0000 UTC m=+0.072300859 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:43:02 np0005539505 python3.9[199694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:43:03 np0005539505 python3.9[199817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398581.958342-1956-246549541717413/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:43:03 np0005539505 python3.9[199969]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 29 01:43:04 np0005539505 python3.9[200121]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:43:05 np0005539505 python3[200273]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:43:07 np0005539505 podman[200286]: 2025-11-29 06:43:07.149917259 +0000 UTC m=+1.333548556 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 01:43:07 np0005539505 podman[200386]: 2025-11-29 06:43:07.277866088 +0000 UTC m=+0.045780415 container create d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:43:07 np0005539505 podman[200386]: 2025-11-29 06:43:07.253385291 +0000 UTC m=+0.021299638 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 29 01:43:07 np0005539505 python3[200273]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 29 01:43:08 np0005539505 podman[200501]: 2025-11-29 06:43:08.716087729 +0000 UTC m=+0.050396764 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:43:09 np0005539505 python3.9[200596]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:09 np0005539505 python3.9[200750]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:10 np0005539505 python3.9[200901]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398589.9978921-2114-269314824465032/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:11 np0005539505 python3.9[200977]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:43:11 np0005539505 systemd[1]: Reloading.
Nov 29 01:43:11 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:11 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:12 np0005539505 python3.9[201088]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:12 np0005539505 systemd[1]: Reloading.
Nov 29 01:43:12 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:12 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:12 np0005539505 systemd[1]: Starting podman_exporter container...
Nov 29 01:43:12 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:43:12 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4eecd20bd2c34379609af6ea7f2a5356c35a1db7b0affc2bfe03836dd14d0e/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4eecd20bd2c34379609af6ea7f2a5356c35a1db7b0affc2bfe03836dd14d0e/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:12 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.
Nov 29 01:43:12 np0005539505 podman[201128]: 2025-11-29 06:43:12.496818416 +0000 UTC m=+0.122676642 container init d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.512Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.512Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.512Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.512Z caller=handler.go:105 level=info collector=container
Nov 29 01:43:12 np0005539505 podman[201128]: 2025-11-29 06:43:12.520764978 +0000 UTC m=+0.146623194 container start d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:43:12 np0005539505 podman[201128]: podman_exporter
Nov 29 01:43:12 np0005539505 systemd[1]: Starting Podman API Service...
Nov 29 01:43:12 np0005539505 systemd[1]: Started Podman API Service.
Nov 29 01:43:12 np0005539505 systemd[1]: Started podman_exporter container.
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="Setting parallel job count to 25"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="Using sqlite as database backend"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 29 01:43:12 np0005539505 podman[201154]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 01:43:12 np0005539505 podman[201154]: time="2025-11-29T06:43:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 01:43:12 np0005539505 podman[201152]: 2025-11-29 06:43:12.591013398 +0000 UTC m=+0.058337838 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:12 np0005539505 podman[201154]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Nov 29 01:43:12 np0005539505 systemd[1]: d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a-283661924b69355d.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.596Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 01:43:12 np0005539505 systemd[1]: d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a-283661924b69355d.service: Failed with result 'exit-code'.
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.597Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 01:43:12 np0005539505 podman_exporter[201143]: ts=2025-11-29T06:43:12.597Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 01:43:13 np0005539505 podman[201311]: 2025-11-29 06:43:13.73907533 +0000 UTC m=+0.077204476 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 29 01:43:14 np0005539505 python3.9[201359]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:14 np0005539505 systemd[1]: Stopping podman_exporter container...
Nov 29 01:43:14 np0005539505 podman[201154]: @ - - [29/Nov/2025:06:43:12 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3437 "" "Go-http-client/1.1"
Nov 29 01:43:14 np0005539505 systemd[1]: libpod-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.scope: Deactivated successfully.
Nov 29 01:43:14 np0005539505 podman[201372]: 2025-11-29 06:43:14.141236901 +0000 UTC m=+0.040724713 container died d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:14 np0005539505 systemd[1]: d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a-283661924b69355d.timer: Deactivated successfully.
Nov 29 01:43:14 np0005539505 systemd[1]: Stopped /usr/bin/podman healthcheck run d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.
Nov 29 01:43:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay-6d4eecd20bd2c34379609af6ea7f2a5356c35a1db7b0affc2bfe03836dd14d0e-merged.mount: Deactivated successfully.
Nov 29 01:43:14 np0005539505 podman[201372]: 2025-11-29 06:43:14.905588641 +0000 UTC m=+0.805076453 container cleanup d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:43:14 np0005539505 podman[201372]: podman_exporter
Nov 29 01:43:14 np0005539505 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:14 np0005539505 podman[201398]: podman_exporter
Nov 29 01:43:14 np0005539505 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:14 np0005539505 systemd[1]: Stopped podman_exporter container.
Nov 29 01:43:14 np0005539505 systemd[1]: Starting podman_exporter container...
Nov 29 01:43:15 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:43:15 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4eecd20bd2c34379609af6ea7f2a5356c35a1db7b0affc2bfe03836dd14d0e/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:15 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d4eecd20bd2c34379609af6ea7f2a5356c35a1db7b0affc2bfe03836dd14d0e/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:15 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.
Nov 29 01:43:15 np0005539505 podman[201411]: 2025-11-29 06:43:15.131268741 +0000 UTC m=+0.147263562 container init d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.144Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.144Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.144Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.144Z caller=handler.go:105 level=info collector=container
Nov 29 01:43:15 np0005539505 podman[201154]: @ - - [29/Nov/2025:06:43:15 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 29 01:43:15 np0005539505 podman[201154]: time="2025-11-29T06:43:15Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 01:43:15 np0005539505 podman[201411]: 2025-11-29 06:43:15.161609902 +0000 UTC m=+0.177604713 container start d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:15 np0005539505 podman[201411]: podman_exporter
Nov 29 01:43:15 np0005539505 podman[201154]: @ - - [29/Nov/2025:06:43:15 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19571 "" "Go-http-client/1.1"
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.277Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.277Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 29 01:43:15 np0005539505 podman_exporter[201423]: ts=2025-11-29T06:43:15.278Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 29 01:43:15 np0005539505 systemd[1]: Started podman_exporter container.
Nov 29 01:43:15 np0005539505 podman[201433]: 2025-11-29 06:43:15.300954881 +0000 UTC m=+0.130683067 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:43:15 np0005539505 python3.9[201607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:43:16 np0005539505 python3.9[201730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764398595.571725-2211-102798502058261/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 29 01:43:17 np0005539505 podman[201854]: 2025-11-29 06:43:17.426710925 +0000 UTC m=+0.042647969 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:43:17 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:17 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Failed with result 'exit-code'.
Nov 29 01:43:17 np0005539505 python3.9[201901]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 29 01:43:18 np0005539505 python3.9[202054]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 29 01:43:19 np0005539505 python3[202206]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 29 01:43:22 np0005539505 podman[202246]: 2025-11-29 06:43:22.633017738 +0000 UTC m=+1.931014135 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 01:43:25 np0005539505 podman[202219]: 2025-11-29 06:43:25.353888189 +0000 UTC m=+5.975449411 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:25 np0005539505 podman[202334]: 2025-11-29 06:43:25.468716883 +0000 UTC m=+0.023127640 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:25 np0005539505 podman[202334]: 2025-11-29 06:43:25.776535977 +0000 UTC m=+0.330946704 container create 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:43:25 np0005539505 python3[202206]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 29 01:43:26 np0005539505 auditd[700]: Audit daemon rotating log files
Nov 29 01:43:26 np0005539505 python3.9[202523]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:43:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:43:26.917 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:43:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:43:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:27 np0005539505 python3.9[202677]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:28 np0005539505 python3.9[202828]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764398607.44753-2369-155285484676132/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:43:28 np0005539505 python3.9[202904]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 29 01:43:28 np0005539505 systemd[1]: Reloading.
Nov 29 01:43:28 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:28 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:29 np0005539505 python3.9[203015]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 29 01:43:29 np0005539505 systemd[1]: Reloading.
Nov 29 01:43:29 np0005539505 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 29 01:43:29 np0005539505 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 29 01:43:30 np0005539505 systemd[1]: Starting openstack_network_exporter container...
Nov 29 01:43:30 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:43:30 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:30 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.
Nov 29 01:43:30 np0005539505 podman[203054]: 2025-11-29 06:43:30.832492979 +0000 UTC m=+0.748167390 container init 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *bridge.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *coverage.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *datapath.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *iface.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *memory.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *ovnnorthd.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *ovn.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *ovsdbserver.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *pmd_perf.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *pmd_rxq.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: INFO    06:43:30 main.go:48: registering *vswitch.Collector
Nov 29 01:43:30 np0005539505 openstack_network_exporter[203070]: NOTICE  06:43:30 main.go:76: listening on https://:9105/metrics
Nov 29 01:43:30 np0005539505 podman[203054]: 2025-11-29 06:43:30.873453939 +0000 UTC m=+0.789128310 container start 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:43:30 np0005539505 podman[203054]: openstack_network_exporter
Nov 29 01:43:30 np0005539505 systemd[1]: Started openstack_network_exporter container.
Nov 29 01:43:30 np0005539505 podman[203080]: 2025-11-29 06:43:30.985612108 +0000 UTC m=+0.098686102 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:43:31 np0005539505 python3.9[203252]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 29 01:43:31 np0005539505 systemd[1]: Stopping openstack_network_exporter container...
Nov 29 01:43:31 np0005539505 systemd[1]: libpod-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.scope: Deactivated successfully.
Nov 29 01:43:31 np0005539505 podman[203256]: 2025-11-29 06:43:31.868303074 +0000 UTC m=+0.130124835 container died 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 01:43:31 np0005539505 systemd[1]: 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e-593f0736240248aa.timer: Deactivated successfully.
Nov 29 01:43:31 np0005539505 systemd[1]: Stopped /usr/bin/podman healthcheck run 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.
Nov 29 01:43:32 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e-userdata-shm.mount: Deactivated successfully.
Nov 29 01:43:32 np0005539505 systemd[1]: var-lib-containers-storage-overlay-23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926-merged.mount: Deactivated successfully.
Nov 29 01:43:32 np0005539505 podman[203272]: 2025-11-29 06:43:32.041645181 +0000 UTC m=+0.155067495 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:43:32 np0005539505 nova_compute[186958]: 2025-11-29 06:43:32.745 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:32 np0005539505 nova_compute[186958]: 2025-11-29 06:43:32.885 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539505 nova_compute[186958]: 2025-11-29 06:43:33.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539505 nova_compute[186958]: 2025-11-29 06:43:33.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539505 nova_compute[186958]: 2025-11-29 06:43:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:33 np0005539505 nova_compute[186958]: 2025-11-29 06:43:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.395 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.397 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.419 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.420 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.420 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.420 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.591 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.592 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5900MB free_disk=73.37907028198242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.593 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.593 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.689 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.690 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.709 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.721 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.723 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:43:34 np0005539505 nova_compute[186958]: 2025-11-29 06:43:34.723 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:43:35 np0005539505 podman[203256]: 2025-11-29 06:43:35.085546062 +0000 UTC m=+3.347367753 container cleanup 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Nov 29 01:43:35 np0005539505 podman[203256]: openstack_network_exporter
Nov 29 01:43:35 np0005539505 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 29 01:43:35 np0005539505 podman[203308]: openstack_network_exporter
Nov 29 01:43:35 np0005539505 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 29 01:43:35 np0005539505 systemd[1]: Stopped openstack_network_exporter container.
Nov 29 01:43:35 np0005539505 systemd[1]: Starting openstack_network_exporter container...
Nov 29 01:43:35 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:43:35 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:35 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:35 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d528187175e950e9bcd70a5e7d24296d94257d41090f42f734af75f1317926/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 29 01:43:35 np0005539505 systemd[1]: Started /usr/bin/podman healthcheck run 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.
Nov 29 01:43:35 np0005539505 podman[203321]: 2025-11-29 06:43:35.874907029 +0000 UTC m=+0.708772934 container init 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64)
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *bridge.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *coverage.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *datapath.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *iface.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *memory.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *ovnnorthd.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *ovn.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *ovsdbserver.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *pmd_perf.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *pmd_rxq.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: INFO    06:43:35 main.go:48: registering *vswitch.Collector
Nov 29 01:43:35 np0005539505 openstack_network_exporter[203336]: NOTICE  06:43:35 main.go:76: listening on https://:9105/metrics
Nov 29 01:43:35 np0005539505 podman[203321]: 2025-11-29 06:43:35.907210836 +0000 UTC m=+0.741076761 container start 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm)
Nov 29 01:43:36 np0005539505 podman[203321]: openstack_network_exporter
Nov 29 01:43:36 np0005539505 systemd[1]: Started openstack_network_exporter container.
Nov 29 01:43:36 np0005539505 podman[203346]: 2025-11-29 06:43:36.214388731 +0000 UTC m=+0.299744568 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:43:36 np0005539505 python3.9[203518]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 29 01:43:39 np0005539505 podman[203543]: 2025-11-29 06:43:39.737473359 +0000 UTC m=+0.074213415 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:43:44 np0005539505 podman[203564]: 2025-11-29 06:43:44.773042439 +0000 UTC m=+0.094143885 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:43:45 np0005539505 podman[203590]: 2025-11-29 06:43:45.744887357 +0000 UTC m=+0.063470273 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:43:47 np0005539505 podman[203614]: 2025-11-29 06:43:47.753763748 +0000 UTC m=+0.078401353 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 29 01:43:47 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:43:47 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Failed with result 'exit-code'.
Nov 29 01:43:53 np0005539505 podman[203633]: 2025-11-29 06:43:53.733246191 +0000 UTC m=+0.061613092 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 29 01:44:05 np0005539505 podman[203653]: 2025-11-29 06:44:05.744439063 +0000 UTC m=+0.077972720 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:44:06 np0005539505 podman[203677]: 2025-11-29 06:44:06.728118166 +0000 UTC m=+0.057451795 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Nov 29 01:44:10 np0005539505 podman[203699]: 2025-11-29 06:44:10.755278597 +0000 UTC m=+0.074889954 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:44:15 np0005539505 podman[203718]: 2025-11-29 06:44:15.754782714 +0000 UTC m=+0.090578944 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:44:15 np0005539505 podman[203744]: 2025-11-29 06:44:15.87391681 +0000 UTC m=+0.077817897 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:44:18 np0005539505 podman[203768]: 2025-11-29 06:44:18.758419496 +0000 UTC m=+0.085823431 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 29 01:44:18 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Main process exited, code=exited, status=1/FAILURE
Nov 29 01:44:18 np0005539505 systemd[1]: d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623-26cbd15752bc557b.service: Failed with result 'exit-code'.
Nov 29 01:44:24 np0005539505 podman[203787]: 2025-11-29 06:44:24.767657471 +0000 UTC m=+0.079125097 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:44:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:44:26.918 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:44:26.919 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:44:26.919 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:27 np0005539505 python3.9[203935]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 29 01:44:28 np0005539505 python3.9[204100]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:29 np0005539505 systemd[1]: Started libpod-conmon-f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43.scope.
Nov 29 01:44:29 np0005539505 podman[204101]: 2025-11-29 06:44:29.413060447 +0000 UTC m=+0.575610764 container exec f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:44:29 np0005539505 podman[204101]: 2025-11-29 06:44:29.685635746 +0000 UTC m=+0.848186063 container exec_died f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:44:30 np0005539505 systemd[1]: libpod-conmon-f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43.scope: Deactivated successfully.
Nov 29 01:44:31 np0005539505 python3.9[204286]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:31 np0005539505 systemd[1]: Started libpod-conmon-f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43.scope.
Nov 29 01:44:31 np0005539505 podman[204287]: 2025-11-29 06:44:31.336870639 +0000 UTC m=+0.097753841 container exec f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 01:44:31 np0005539505 podman[204287]: 2025-11-29 06:44:31.369893268 +0000 UTC m=+0.130776420 container exec_died f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 01:44:31 np0005539505 systemd[1]: libpod-conmon-f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43.scope: Deactivated successfully.
Nov 29 01:44:32 np0005539505 python3.9[204471]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:32 np0005539505 nova_compute[186958]: 2025-11-29 06:44:32.704 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:33 np0005539505 nova_compute[186958]: 2025-11-29 06:44:33.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:33 np0005539505 python3.9[204623]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 29 01:44:34 np0005539505 python3.9[204788]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.440 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.441 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.441 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.442 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:34 np0005539505 nova_compute[186958]: 2025-11-29 06:44:34.442 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:44:34 np0005539505 systemd[1]: Started libpod-conmon-ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282.scope.
Nov 29 01:44:34 np0005539505 podman[204789]: 2025-11-29 06:44:34.693436868 +0000 UTC m=+0.490851370 container exec ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 01:44:34 np0005539505 podman[204789]: 2025-11-29 06:44:34.73083043 +0000 UTC m=+0.528244922 container exec_died ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:44:34 np0005539505 systemd[1]: libpod-conmon-ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282.scope: Deactivated successfully.
Nov 29 01:44:35 np0005539505 nova_compute[186958]: 2025-11-29 06:44:35.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:35 np0005539505 python3.9[204970]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:35 np0005539505 systemd[1]: Started libpod-conmon-ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282.scope.
Nov 29 01:44:35 np0005539505 podman[204971]: 2025-11-29 06:44:35.861822487 +0000 UTC m=+0.069169527 container exec ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:44:35 np0005539505 podman[204971]: 2025-11-29 06:44:35.892528571 +0000 UTC m=+0.099875601 container exec_died ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:44:35 np0005539505 systemd[1]: libpod-conmon-ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282.scope: Deactivated successfully.
Nov 29 01:44:35 np0005539505 podman[204988]: 2025-11-29 06:44:35.94866765 +0000 UTC m=+0.080678010 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:44:36 np0005539505 nova_compute[186958]: 2025-11-29 06:44:36.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:44:36 np0005539505 python3.9[205176]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.250 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.250 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.251 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.251 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.412 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.413 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5979MB free_disk=73.37947845458984GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.413 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:44:37 np0005539505 podman[205300]: 2025-11-29 06:44:37.522115806 +0000 UTC m=+0.053352632 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.539 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.539 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.593 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.631 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.632 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:44:37 np0005539505 nova_compute[186958]: 2025-11-29 06:44:37.632 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:44:37 np0005539505 python3.9[205349]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 29 01:44:38 np0005539505 python3.9[205514]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:38 np0005539505 systemd[1]: Started libpod-conmon-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.scope.
Nov 29 01:44:38 np0005539505 podman[205515]: 2025-11-29 06:44:38.621515725 +0000 UTC m=+0.076773301 container exec 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:44:38 np0005539505 podman[205515]: 2025-11-29 06:44:38.655585894 +0000 UTC m=+0.110843450 container exec_died 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 01:44:38 np0005539505 systemd[1]: libpod-conmon-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.scope: Deactivated successfully.
Nov 29 01:44:39 np0005539505 python3.9[205698]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:39 np0005539505 systemd[1]: Started libpod-conmon-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.scope.
Nov 29 01:44:39 np0005539505 podman[205699]: 2025-11-29 06:44:39.496680885 +0000 UTC m=+0.063056375 container exec 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Nov 29 01:44:39 np0005539505 podman[205699]: 2025-11-29 06:44:39.531543866 +0000 UTC m=+0.097919336 container exec_died 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:44:39 np0005539505 systemd[1]: libpod-conmon-25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3.scope: Deactivated successfully.
Nov 29 01:44:40 np0005539505 python3.9[205881]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:40 np0005539505 podman[206033]: 2025-11-29 06:44:40.858354043 +0000 UTC m=+0.057491989 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:44:40 np0005539505 python3.9[206034]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 29 01:44:41 np0005539505 python3.9[206219]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:42 np0005539505 systemd[1]: Started libpod-conmon-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope.
Nov 29 01:44:42 np0005539505 podman[206220]: 2025-11-29 06:44:42.61597202 +0000 UTC m=+0.779259974 container exec d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=edpm)
Nov 29 01:44:42 np0005539505 podman[206220]: 2025-11-29 06:44:42.649960787 +0000 UTC m=+0.813248721 container exec_died d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:44:42 np0005539505 systemd[1]: libpod-conmon-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope: Deactivated successfully.
Nov 29 01:44:43 np0005539505 python3.9[206403]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:44 np0005539505 systemd[1]: Started libpod-conmon-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope.
Nov 29 01:44:44 np0005539505 podman[206404]: 2025-11-29 06:44:44.351949907 +0000 UTC m=+1.013838933 container exec d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:44:44 np0005539505 podman[206404]: 2025-11-29 06:44:44.493129869 +0000 UTC m=+1.155018875 container exec_died d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:44:44 np0005539505 systemd[1]: libpod-conmon-d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623.scope: Deactivated successfully.
Nov 29 01:44:45 np0005539505 python3.9[206587]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:46 np0005539505 podman[206711]: 2025-11-29 06:44:46.006813543 +0000 UTC m=+0.058220989 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:44:46 np0005539505 podman[206712]: 2025-11-29 06:44:46.041968142 +0000 UTC m=+0.093740408 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:44:46 np0005539505 python3.9[206780]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 29 01:44:46 np0005539505 python3.9[206951]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:47 np0005539505 systemd[1]: Started libpod-conmon-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.scope.
Nov 29 01:44:47 np0005539505 podman[206952]: 2025-11-29 06:44:47.350683289 +0000 UTC m=+0.369077955 container exec 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:44:47 np0005539505 podman[206972]: 2025-11-29 06:44:47.508404856 +0000 UTC m=+0.142244213 container exec_died 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:44:47 np0005539505 podman[206952]: 2025-11-29 06:44:47.59706564 +0000 UTC m=+0.615460316 container exec_died 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:44:47 np0005539505 systemd[1]: libpod-conmon-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.scope: Deactivated successfully.
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:44:48.079 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:44:48 np0005539505 python3.9[207136]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:48 np0005539505 systemd[1]: Started libpod-conmon-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.scope.
Nov 29 01:44:48 np0005539505 podman[207137]: 2025-11-29 06:44:48.80738926 +0000 UTC m=+0.081805212 container exec 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:44:48 np0005539505 podman[207137]: 2025-11-29 06:44:48.844528015 +0000 UTC m=+0.118943947 container exec_died 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:44:48 np0005539505 systemd[1]: libpod-conmon-934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf.scope: Deactivated successfully.
Nov 29 01:44:48 np0005539505 podman[207153]: 2025-11-29 06:44:48.920735269 +0000 UTC m=+0.104262574 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:44:49 np0005539505 python3.9[207338]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:50 np0005539505 python3.9[207490]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 29 01:44:51 np0005539505 python3.9[207656]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:51 np0005539505 systemd[1]: Started libpod-conmon-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.scope.
Nov 29 01:44:51 np0005539505 podman[207657]: 2025-11-29 06:44:51.478842855 +0000 UTC m=+0.360150723 container exec d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:44:51 np0005539505 podman[207676]: 2025-11-29 06:44:51.876933084 +0000 UTC m=+0.387663057 container exec_died d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:44:51 np0005539505 podman[207657]: 2025-11-29 06:44:51.99336358 +0000 UTC m=+0.874671438 container exec_died d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:44:52 np0005539505 systemd[1]: libpod-conmon-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.scope: Deactivated successfully.
Nov 29 01:44:52 np0005539505 python3.9[207840]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:53 np0005539505 systemd[1]: Started libpod-conmon-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.scope.
Nov 29 01:44:53 np0005539505 podman[207841]: 2025-11-29 06:44:53.356615042 +0000 UTC m=+0.373140299 container exec d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:44:53 np0005539505 podman[207861]: 2025-11-29 06:44:53.476476224 +0000 UTC m=+0.109431339 container exec_died d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:44:53 np0005539505 podman[207841]: 2025-11-29 06:44:53.688462767 +0000 UTC m=+0.704988024 container exec_died d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:44:53 np0005539505 systemd[1]: libpod-conmon-d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a.scope: Deactivated successfully.
Nov 29 01:44:54 np0005539505 python3.9[208025]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:55 np0005539505 podman[208149]: 2025-11-29 06:44:55.242006222 +0000 UTC m=+0.070081342 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:44:55 np0005539505 python3.9[208196]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 29 01:44:56 np0005539505 python3.9[208362]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:56 np0005539505 systemd[1]: Started libpod-conmon-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.scope.
Nov 29 01:44:56 np0005539505 podman[208363]: 2025-11-29 06:44:56.447047744 +0000 UTC m=+0.305320721 container exec 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64)
Nov 29 01:44:56 np0005539505 podman[208382]: 2025-11-29 06:44:56.533501777 +0000 UTC m=+0.070761872 container exec_died 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:44:56 np0005539505 podman[208363]: 2025-11-29 06:44:56.707579254 +0000 UTC m=+0.565852261 container exec_died 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 29 01:44:56 np0005539505 systemd[1]: libpod-conmon-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.scope: Deactivated successfully.
Nov 29 01:44:57 np0005539505 python3.9[208546]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 29 01:44:57 np0005539505 systemd[1]: Started libpod-conmon-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.scope.
Nov 29 01:44:57 np0005539505 podman[208547]: 2025-11-29 06:44:57.523641731 +0000 UTC m=+0.077634255 container exec 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:44:57 np0005539505 podman[208547]: 2025-11-29 06:44:57.554592992 +0000 UTC m=+0.108585486 container exec_died 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc.)
Nov 29 01:44:57 np0005539505 systemd[1]: libpod-conmon-88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e.scope: Deactivated successfully.
Nov 29 01:44:58 np0005539505 python3.9[208731]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:59 np0005539505 python3.9[208883]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:44:59 np0005539505 python3.9[209035]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:00 np0005539505 python3.9[209158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764398699.3818426-3213-223735385196862/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:01 np0005539505 python3.9[209310]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:02 np0005539505 python3.9[209462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:02 np0005539505 python3.9[209540]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:03 np0005539505 python3.9[209692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:03 np0005539505 python3.9[209770]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7j0glo07 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:04 np0005539505 python3.9[209922]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:05 np0005539505 python3.9[210000]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:06 np0005539505 python3.9[210152]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:06 np0005539505 podman[210230]: 2025-11-29 06:45:06.762488013 +0000 UTC m=+0.083747497 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:45:07 np0005539505 python3[210329]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 29 01:45:07 np0005539505 podman[210429]: 2025-11-29 06:45:07.771385137 +0000 UTC m=+0.090410654 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:45:08 np0005539505 python3.9[210503]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:08 np0005539505 python3.9[210581]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:09 np0005539505 python3.9[210733]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:09 np0005539505 python3.9[210811]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:10 np0005539505 python3.9[210963]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:10 np0005539505 python3.9[211041]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:11 np0005539505 podman[211165]: 2025-11-29 06:45:11.603196214 +0000 UTC m=+0.053130835 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:45:11 np0005539505 python3.9[211209]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:12 np0005539505 python3.9[211288]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:12 np0005539505 python3.9[211440]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 29 01:45:13 np0005539505 python3.9[211565]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764398712.470684-3588-217990371086866/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:14 np0005539505 python3.9[211717]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:15 np0005539505 python3.9[211869]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:16 np0005539505 podman[212024]: 2025-11-29 06:45:16.154118424 +0000 UTC m=+0.104344826 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:45:16 np0005539505 python3.9[212025]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:16 np0005539505 podman[212049]: 2025-11-29 06:45:16.250987679 +0000 UTC m=+0.074481546 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:45:17 np0005539505 python3.9[212226]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:18 np0005539505 python3.9[212379]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 29 01:45:18 np0005539505 python3.9[212533]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 29 01:45:19 np0005539505 podman[212660]: 2025-11-29 06:45:19.393051143 +0000 UTC m=+0.056437799 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:45:19 np0005539505 python3.9[212706]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 29 01:45:20 np0005539505 systemd-logind[794]: Session 26 logged out. Waiting for processes to exit.
Nov 29 01:45:20 np0005539505 systemd[1]: session-26.scope: Deactivated successfully.
Nov 29 01:45:20 np0005539505 systemd[1]: session-26.scope: Consumed 1min 36.170s CPU time.
Nov 29 01:45:20 np0005539505 systemd-logind[794]: Removed session 26.
Nov 29 01:45:25 np0005539505 podman[212733]: 2025-11-29 06:45:25.888448235 +0000 UTC m=+0.059100212 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 29 01:45:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:26.920 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:26.922 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:26.922 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:33 np0005539505 nova_compute[186958]: 2025-11-29 06:45:33.633 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539505 nova_compute[186958]: 2025-11-29 06:45:34.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539505 nova_compute[186958]: 2025-11-29 06:45:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:34 np0005539505 nova_compute[186958]: 2025-11-29 06:45:34.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:45:35 np0005539505 nova_compute[186958]: 2025-11-29 06:45:35.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:35 np0005539505 nova_compute[186958]: 2025-11-29 06:45:35.390 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:35 np0005539505 nova_compute[186958]: 2025-11-29 06:45:35.391 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:36 np0005539505 nova_compute[186958]: 2025-11-29 06:45:36.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:36 np0005539505 nova_compute[186958]: 2025-11-29 06:45:36.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:36 np0005539505 nova_compute[186958]: 2025-11-29 06:45:36.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:45:36 np0005539505 nova_compute[186958]: 2025-11-29 06:45:36.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:45:36 np0005539505 nova_compute[186958]: 2025-11-29 06:45:36.492 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:45:37 np0005539505 nova_compute[186958]: 2025-11-29 06:45:37.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:37 np0005539505 podman[212755]: 2025-11-29 06:45:37.721993407 +0000 UTC m=+0.050863731 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.408 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.409 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.537 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.538 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5994MB free_disk=73.37909698486328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.538 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.538 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.621 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.622 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.640 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.662 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.663 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:45:38 np0005539505 nova_compute[186958]: 2025-11-29 06:45:38.663 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:45:38 np0005539505 podman[212779]: 2025-11-29 06:45:38.733142382 +0000 UTC m=+0.072638138 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Nov 29 01:45:41 np0005539505 podman[212800]: 2025-11-29 06:45:41.75024998 +0000 UTC m=+0.074636893 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 29 01:45:46 np0005539505 podman[212818]: 2025-11-29 06:45:46.759601661 +0000 UTC m=+0.076901536 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:45:46 np0005539505 podman[212819]: 2025-11-29 06:45:46.765738892 +0000 UTC m=+0.085717262 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:45:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:47.988 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:45:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:47.989 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:45:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:45:47.990 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:45:49 np0005539505 podman[212866]: 2025-11-29 06:45:49.726050636 +0000 UTC m=+0.056429656 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 29 01:45:56 np0005539505 podman[212886]: 2025-11-29 06:45:56.738952822 +0000 UTC m=+0.064446369 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:46:08 np0005539505 podman[212906]: 2025-11-29 06:46:08.710384692 +0000 UTC m=+0.043105663 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:46:09 np0005539505 podman[212930]: 2025-11-29 06:46:09.777994142 +0000 UTC m=+0.093917132 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 01:46:12 np0005539505 podman[212949]: 2025-11-29 06:46:12.740752204 +0000 UTC m=+0.071350842 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:46:17 np0005539505 podman[212968]: 2025-11-29 06:46:17.756105331 +0000 UTC m=+0.077064841 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:46:17 np0005539505 podman[212969]: 2025-11-29 06:46:17.780965285 +0000 UTC m=+0.102283975 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:46:20 np0005539505 podman[213018]: 2025-11-29 06:46:20.752814421 +0000 UTC m=+0.074761847 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:46:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:46:26.922 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:46:26.923 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:46:26.923 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:27 np0005539505 podman[213039]: 2025-11-29 06:46:27.739432264 +0000 UTC m=+0.077247637 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.395 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.396 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:46:32 np0005539505 nova_compute[186958]: 2025-11-29 06:46:32.408 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:35 np0005539505 nova_compute[186958]: 2025-11-29 06:46:35.417 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:36 np0005539505 nova_compute[186958]: 2025-11-29 06:46:36.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:36 np0005539505 nova_compute[186958]: 2025-11-29 06:46:36.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:36 np0005539505 nova_compute[186958]: 2025-11-29 06:46:36.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:36 np0005539505 nova_compute[186958]: 2025-11-29 06:46:36.377 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:46:37 np0005539505 nova_compute[186958]: 2025-11-29 06:46:37.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:37 np0005539505 nova_compute[186958]: 2025-11-29 06:46:37.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:46:37 np0005539505 nova_compute[186958]: 2025-11-29 06:46:37.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:46:37 np0005539505 nova_compute[186958]: 2025-11-29 06:46:37.591 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:46:37 np0005539505 nova_compute[186958]: 2025-11-29 06:46:37.592 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:38 np0005539505 nova_compute[186958]: 2025-11-29 06:46:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.243 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.243 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.244 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.244 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.389 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.389 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6035MB free_disk=73.37909698486328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.390 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:46:39 np0005539505 nova_compute[186958]: 2025-11-29 06:46:39.390 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:46:39 np0005539505 podman[213059]: 2025-11-29 06:46:39.727727266 +0000 UTC m=+0.058340012 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:46:40 np0005539505 podman[213083]: 2025-11-29 06:46:40.735914804 +0000 UTC m=+0.062228951 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, name=ubi9-minimal, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 29 01:46:43 np0005539505 podman[213104]: 2025-11-29 06:46:43.749927135 +0000 UTC m=+0.076724336 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:46:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:46:48 np0005539505 podman[213123]: 2025-11-29 06:46:48.717124357 +0000 UTC m=+0.051383648 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:46:48 np0005539505 podman[213124]: 2025-11-29 06:46:48.74798862 +0000 UTC m=+0.078099845 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.544 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.545 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.598 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.651 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.651 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.674 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.705 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:46:49 np0005539505 nova_compute[186958]: 2025-11-29 06:46:49.725 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:46:50 np0005539505 nova_compute[186958]: 2025-11-29 06:46:50.286 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:46:50 np0005539505 nova_compute[186958]: 2025-11-29 06:46:50.289 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:46:50 np0005539505 nova_compute[186958]: 2025-11-29 06:46:50.290 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 10.900s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:46:51 np0005539505 nova_compute[186958]: 2025-11-29 06:46:51.291 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:46:51 np0005539505 podman[213173]: 2025-11-29 06:46:51.758820932 +0000 UTC m=+0.081468058 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:46:58 np0005539505 podman[213193]: 2025-11-29 06:46:58.745147438 +0000 UTC m=+0.075817151 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:47:10 np0005539505 podman[213214]: 2025-11-29 06:47:10.762539409 +0000 UTC m=+0.097964740 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:47:10 np0005539505 podman[213238]: 2025-11-29 06:47:10.888141581 +0000 UTC m=+0.083295660 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, architecture=x86_64)
Nov 29 01:47:14 np0005539505 podman[213261]: 2025-11-29 06:47:14.757045405 +0000 UTC m=+0.076717416 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 29 01:47:19 np0005539505 podman[213280]: 2025-11-29 06:47:19.722136857 +0000 UTC m=+0.053960490 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:47:19 np0005539505 podman[213281]: 2025-11-29 06:47:19.811443084 +0000 UTC m=+0.128312609 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:47:22 np0005539505 podman[213329]: 2025-11-29 06:47:22.74878861 +0000 UTC m=+0.080329407 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:47:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:26.923 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:26.923 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:26.924 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:29 np0005539505 podman[213349]: 2025-11-29 06:47:29.756158202 +0000 UTC m=+0.078283970 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:47:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:30.684 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:47:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:30.685 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:47:33 np0005539505 nova_compute[186958]: 2025-11-29 06:47:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:47:33.686 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:47:35 np0005539505 nova_compute[186958]: 2025-11-29 06:47:35.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:36 np0005539505 nova_compute[186958]: 2025-11-29 06:47:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:36 np0005539505 nova_compute[186958]: 2025-11-29 06:47:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:37 np0005539505 nova_compute[186958]: 2025-11-29 06:47:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:37 np0005539505 nova_compute[186958]: 2025-11-29 06:47:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:37 np0005539505 nova_compute[186958]: 2025-11-29 06:47:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:47:38 np0005539505 nova_compute[186958]: 2025-11-29 06:47:38.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.394 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.394 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.417 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.417 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.417 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.418 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.546 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.547 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6061MB free_disk=73.37911605834961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.547 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.547 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.608 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.609 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.633 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.645 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.646 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:47:39 np0005539505 nova_compute[186958]: 2025-11-29 06:47:39.646 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:41 np0005539505 nova_compute[186958]: 2025-11-29 06:47:41.630 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:47:41 np0005539505 podman[213369]: 2025-11-29 06:47:41.711983054 +0000 UTC m=+0.048310472 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 01:47:41 np0005539505 podman[213370]: 2025-11-29 06:47:41.725210684 +0000 UTC m=+0.057932881 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:47:45 np0005539505 podman[213415]: 2025-11-29 06:47:45.743337978 +0000 UTC m=+0.068668901 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:47:46 np0005539505 nova_compute[186958]: 2025-11-29 06:47:46.669 186962 DEBUG oslo_concurrency.processutils [None req-7ed93898-f050-4446-a839-fc08ea4dee1b c2724998d3164e38b48f25d41018a8e1 50c59264d1984f368b2b5da56ea48520 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:47:46 np0005539505 nova_compute[186958]: 2025-11-29 06:47:46.707 186962 DEBUG oslo_concurrency.processutils [None req-7ed93898-f050-4446-a839-fc08ea4dee1b c2724998d3164e38b48f25d41018a8e1 50c59264d1984f368b2b5da56ea48520 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:47:50 np0005539505 podman[213434]: 2025-11-29 06:47:50.737419901 +0000 UTC m=+0.067766106 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:47:50 np0005539505 podman[213435]: 2025-11-29 06:47:50.796114812 +0000 UTC m=+0.112500637 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:47:53 np0005539505 podman[213482]: 2025-11-29 06:47:53.744906479 +0000 UTC m=+0.074923716 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.332 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.332 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.352 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.584 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.585 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.593 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.593 186962 INFO nova.compute.claims [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.772 186962 DEBUG nova.compute.provider_tree [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.809 186962 DEBUG nova.scheduler.client.report [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.836 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.837 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.904 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.905 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.942 186962 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:47:59 np0005539505 nova_compute[186958]: 2025-11-29 06:47:59.971 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.164 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.165 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.166 186962 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Creating image(s)#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.166 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.167 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.167 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.168 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.168 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:00 np0005539505 nova_compute[186958]: 2025-11-29 06:48:00.663 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Automatically allocating a network for project 6d2e7db012114f9eb8e8e1b0123c9974. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Nov 29 01:48:00 np0005539505 podman[213502]: 2025-11-29 06:48:00.743982841 +0000 UTC m=+0.071778058 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.205 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.300 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.302 186962 DEBUG nova.virt.images [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] 5d270706-931c-4fd1-846d-ba6ddeac2a79 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.303 186962 DEBUG nova.privsep.utils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.304 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.704 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.part /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted" returned: 0 in 0.400s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.710 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.799 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28.converted --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.802 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:02 np0005539505 nova_compute[186958]: 2025-11-29 06:48:02.834 186962 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpenxl399f/privsep.sock']#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.519 186962 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.398 213540 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.406 213540 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.410 213540 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.411 213540 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213540#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.625 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.697 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.698 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.699 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.709 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.770 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.771 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.940 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk 1073741824" returned: 0 in 0.169s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.941 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:03 np0005539505 nova_compute[186958]: 2025-11-29 06:48:03.942 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.011 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.013 186962 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking if we can resize image /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.013 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.062 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.063 186962 DEBUG nova.virt.disk.api [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Cannot resize image /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.063 186962 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a843394-1a35-483a-89ee-e5946c49ac74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.090 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.091 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Ensure instance console log exists: /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.091 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.091 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:04 np0005539505 nova_compute[186958]: 2025-11-29 06:48:04.092 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:12 np0005539505 podman[213558]: 2025-11-29 06:48:12.711468358 +0000 UTC m=+0.042096578 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:48:12 np0005539505 podman[213557]: 2025-11-29 06:48:12.719147953 +0000 UTC m=+0.055756410 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350)
Nov 29 01:48:16 np0005539505 podman[213602]: 2025-11-29 06:48:16.76845138 +0000 UTC m=+0.082170508 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 01:48:21 np0005539505 podman[213621]: 2025-11-29 06:48:21.752091891 +0000 UTC m=+0.077162819 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:48:21 np0005539505 podman[213622]: 2025-11-29 06:48:21.786728319 +0000 UTC m=+0.108023671 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 01:48:24 np0005539505 podman[213671]: 2025-11-29 06:48:24.937157934 +0000 UTC m=+0.263416015 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:26.925 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:26.927 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:26.927 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:30 np0005539505 nova_compute[186958]: 2025-11-29 06:48:30.738 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Automatically allocated network: {'id': '425e933e-ca72-466c-8d2b-499c7ba67318', 'name': 'auto_allocated_network', 'tenant_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['89838152-b4b2-434b-a7d9-d3f897cb4399', 'a56d2d79-817f-461e-9014-0136415cc45e'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-11-29T06:48:00Z', 'updated_at': '2025-11-29T06:48:09Z', 'revision_number': 4, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Nov 29 01:48:30 np0005539505 nova_compute[186958]: 2025-11-29 06:48:30.748 186962 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 29 01:48:30 np0005539505 nova_compute[186958]: 2025-11-29 06:48:30.748 186962 WARNING oslo_policy.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 29 01:48:30 np0005539505 nova_compute[186958]: 2025-11-29 06:48:30.750 186962 DEBUG nova.policy [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7a31c969c2f744a9810fc9890dd7acb2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:48:31 np0005539505 podman[213691]: 2025-11-29 06:48:31.732713837 +0000 UTC m=+0.069500244 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 01:48:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:32.196 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:48:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:32.197 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:48:33 np0005539505 nova_compute[186958]: 2025-11-29 06:48:33.412 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Successfully created port: 2fb0f375-9270-4e22-8277-6d04ca007319 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:48:35 np0005539505 nova_compute[186958]: 2025-11-29 06:48:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.008 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating tmpfile /var/lib/nova/instances/tmpkcenheiq to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.453 186962 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.483 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.484 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.494 186962 INFO nova.compute.rpcapi [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 29 01:48:36 np0005539505 nova_compute[186958]: 2025-11-29 06:48:36.495 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:37 np0005539505 nova_compute[186958]: 2025-11-29 06:48:37.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:38 np0005539505 nova_compute[186958]: 2025-11-29 06:48:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.594 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.595 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.596 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.596 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.597 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.597 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.840 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.840 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.840 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:39 np0005539505 nova_compute[186958]: 2025-11-29 06:48:39.840 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.011 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.012 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5994MB free_disk=73.34423446655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.012 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.012 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.263 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Migration for instance af865d23-0f24-47aa-aeab-1c12d04b5a1e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.394 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating resource usage from migration e8267ed6-ce75-49c9-85a6-d08b827f6aea
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.394 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Starting to track incoming migration e8267ed6-ce75-49c9-85a6-d08b827f6aea with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.656 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 7a843394-1a35-483a-89ee-e5946c49ac74 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.752 186962 WARNING nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance af865d23-0f24-47aa-aeab-1c12d04b5a1e has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.752 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.753 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.862 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.892 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Successfully updated port: 2fb0f375-9270-4e22-8277-6d04ca007319 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.937 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.938 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquired lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:48:40 np0005539505 nova_compute[186958]: 2025-11-29 06:48:40.938 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.017 186962 ERROR nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [req-46cd7c8c-f908-444e-9c73-425820b793c7] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 2d55ea77-8118-4f48-9bb5-d62d10fd53c0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-46cd7c8c-f908-444e-9c73-425820b793c7"}]}
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.055 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.091 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.091 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.127 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.151 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.222 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.306 186962 DEBUG nova.compute.manager [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-changed-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.307 186962 DEBUG nova.compute.manager [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Refreshing instance network info cache due to event network-changed-2fb0f375-9270-4e22-8277-6d04ca007319. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.307 186962 DEBUG oslo_concurrency.lockutils [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.360 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updated inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.360 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.360 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.396 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.397 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.418 186962 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.666 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.666 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.666 186962 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 01:48:41 np0005539505 nova_compute[186958]: 2025-11-29 06:48:41.720 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 01:48:42 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:42.199 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:48:42 np0005539505 nova_compute[186958]: 2025-11-29 06:48:42.392 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:43 np0005539505 nova_compute[186958]: 2025-11-29 06:48:43.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:48:43 np0005539505 podman[213712]: 2025-11-29 06:48:43.735708856 +0000 UTC m=+0.057503218 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:48:43 np0005539505 podman[213711]: 2025-11-29 06:48:43.746013166 +0000 UTC m=+0.064612249 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, version=9.6, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Nov 29 01:48:47 np0005539505 podman[213757]: 2025-11-29 06:48:47.72669533 +0000 UTC m=+0.060870343 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.076 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.077 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:48:48.078 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.120 186962 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.142 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.160 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.161 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating instance directory: /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.162 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Creating disk.info with the contents: {'/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk': 'qcow2', '/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.162 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.163 186962 DEBUG nova.objects.instance [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'trusted_certs' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.207 186962 DEBUG nova.network.neutron [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Updating instance_info_cache with network_info: [{"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.210 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.235 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Releasing lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.236 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Instance network_info: |[{"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.237 186962 DEBUG oslo_concurrency.lockutils [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.237 186962 DEBUG nova.network.neutron [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Refreshing network info cache for port 2fb0f375-9270-4e22-8277-6d04ca007319 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.241 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Start _get_guest_xml network_info=[{"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.247 186962 WARNING nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.252 186962 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.253 186962 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.257 186962 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.258 186962 DEBUG nova.virt.libvirt.host [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.260 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.261 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.262 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.262 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.262 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.262 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.263 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.263 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.263 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.264 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.264 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.264 186962 DEBUG nova.virt.hardware [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.269 186962 DEBUG nova.privsep.utils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.270 186962 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-1',id=4,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=7a843394-1a35-483a-89ee-e5946c49ac74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.271 186962 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.272 186962 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.274 186962 DEBUG nova.objects.instance [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a843394-1a35-483a-89ee-e5946c49ac74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.294 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <uuid>7a843394-1a35-483a-89ee-e5946c49ac74</uuid>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <name>instance-00000004</name>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-526752650-1</nova:name>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:48:49</nova:creationTime>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:user uuid="7a31c969c2f744a9810fc9890dd7acb2">tempest-AutoAllocateNetworkTest-224859463-project-member</nova:user>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:project uuid="6d2e7db012114f9eb8e8e1b0123c9974">tempest-AutoAllocateNetworkTest-224859463</nova:project>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        <nova:port uuid="2fb0f375-9270-4e22-8277-6d04ca007319">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="fdfe:381f:8400::2f4" ipVersion="6"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.1.0.39" ipVersion="4"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="serial">7a843394-1a35-483a-89ee-e5946c49ac74</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="uuid">7a843394-1a35-483a-89ee-e5946c49ac74</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.config"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:99:e3:8f"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <target dev="tap2fb0f375-92"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/console.log" append="off"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:48:49 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:48:49 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:48:49 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:48:49 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.295 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Preparing to wait for external event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.296 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.296 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.296 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.297 186962 DEBUG nova.virt.libvirt.vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-1',id=4,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:00Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=7a843394-1a35-483a-89ee-e5946c49ac74,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.297 186962 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.298 186962 DEBUG nova.network.os_vif_util [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.299 186962 DEBUG os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.324 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.325 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.325 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.339 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.364 186962 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.365 186962 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.365 186962 DEBUG ovsdbapp.backend.ovs_idl [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLOUT] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.369 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.385 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.386 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.386 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.387 186962 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpqcp_uqkg/privsep.sock']#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.421 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.422 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.583 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk 1073741824" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.585 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.586 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.676 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.676 186962 DEBUG nova.virt.disk.api [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Checking if we can resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.677 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.732 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.733 186962 DEBUG nova.virt.disk.api [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Cannot resize image /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:48:49 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.733 186962 DEBUG nova.objects.instance [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.070 186962 INFO oslo.privsep.daemon [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.958 213797 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.962 213797 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.964 213797 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:49.964 213797 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213797#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.383 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.384 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fb0f375-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.385 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2fb0f375-92, col_values=(('external_ids', {'iface-id': '2fb0f375-9270-4e22-8277-6d04ca007319', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:e3:8f', 'vm-uuid': '7a843394-1a35-483a-89ee-e5946c49ac74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.387 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:50 np0005539505 NetworkManager[55134]: <info>  [1764398930.3888] manager: (tap2fb0f375-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.390 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.394 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:50 np0005539505 nova_compute[186958]: 2025-11-29 06:48:50.395 186962 INFO os_vif [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92')#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.033 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.072 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config 485376" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.074 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config to /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.075 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.207 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.208 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.208 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] No VIF found with MAC fa:16:3e:99:e3:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.210 186962 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Using config drive#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.619 186962 DEBUG oslo_concurrency.processutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk.config /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.620 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.621 186962 DEBUG nova.virt.libvirt.vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:48:27Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.622 186962 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.623 186962 DEBUG nova.network.os_vif_util [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.623 186962 DEBUG os_vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.624 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.625 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.628 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap60d45f94-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.629 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap60d45f94-ad, col_values=(('external_ids', {'iface-id': '60d45f94-ad4f-48ba-a0a9-6b5406aa616c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:09:58', 'vm-uuid': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.630 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:51 np0005539505 NetworkManager[55134]: <info>  [1764398931.6315] manager: (tap60d45f94-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.640 186962 INFO os_vif [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.641 186962 DEBUG nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 29 01:48:51 np0005539505 nova_compute[186958]: 2025-11-29 06:48:51.641 186962 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.334 186962 INFO nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Creating config drive at /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.config#033[00m
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.338 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvbw93gpm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.459 186962 DEBUG oslo_concurrency.processutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvbw93gpm" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:48:52 np0005539505 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 29 01:48:52 np0005539505 NetworkManager[55134]: <info>  [1764398932.5560] manager: (tap2fb0f375-92): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Nov 29 01:48:52 np0005539505 kernel: tap2fb0f375-92: entered promiscuous mode
Nov 29 01:48:52 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:52Z|00027|binding|INFO|Claiming lport 2fb0f375-9270-4e22-8277-6d04ca007319 for this chassis.
Nov 29 01:48:52 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:52Z|00028|binding|INFO|2fb0f375-9270-4e22-8277-6d04ca007319: Claiming fa:16:3e:99:e3:8f 10.1.0.39 fdfe:381f:8400::2f4
Nov 29 01:48:52 np0005539505 systemd-udevd[213855]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.609 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:52 np0005539505 NetworkManager[55134]: <info>  [1764398932.6235] device (tap2fb0f375-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:48:52 np0005539505 NetworkManager[55134]: <info>  [1764398932.6241] device (tap2fb0f375-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:48:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:52.639 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:e3:8f 10.1.0.39 fdfe:381f:8400::2f4'], port_security=['fa:16:3e:99:e3:8f 10.1.0.39 fdfe:381f:8400::2f4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.39/26 fdfe:381f:8400::2f4/64', 'neutron:device_id': '7a843394-1a35-483a-89ee-e5946c49ac74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2fb0f375-9270-4e22-8277-6d04ca007319) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:48:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:52.641 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2fb0f375-9270-4e22-8277-6d04ca007319 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 bound to our chassis#033[00m
Nov 29 01:48:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:52.644 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 425e933e-ca72-466c-8d2b-499c7ba67318#033[00m
Nov 29 01:48:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:52.646 104094 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpf94doslz/privsep.sock']#033[00m
Nov 29 01:48:52 np0005539505 systemd-machined[153285]: New machine qemu-1-instance-00000004.
Nov 29 01:48:52 np0005539505 podman[213821]: 2025-11-29 06:48:52.666146788 +0000 UTC m=+0.124505815 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.692 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:52 np0005539505 podman[213822]: 2025-11-29 06:48:52.693038355 +0000 UTC m=+0.153291525 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:48:52 np0005539505 systemd[1]: Started Virtual Machine qemu-1-instance-00000004.
Nov 29 01:48:52 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:52Z|00029|binding|INFO|Setting lport 2fb0f375-9270-4e22-8277-6d04ca007319 ovn-installed in OVS
Nov 29 01:48:52 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:52Z|00030|binding|INFO|Setting lport 2fb0f375-9270-4e22-8277-6d04ca007319 up in Southbound
Nov 29 01:48:52 np0005539505 nova_compute[186958]: 2025-11-29 06:48:52.701 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.219 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398933.218406, 7a843394-1a35-483a-89ee-e5946c49ac74 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.221 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] VM Started (Lifecycle Event)#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.277 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.281 104094 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.282 104094 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpf94doslz/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.282 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398933.2189753, 7a843394-1a35-483a-89ee-e5946c49ac74 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.282 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.162 213906 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.167 213906 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.169 213906 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.169 213906 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213906#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.285 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[749c269b-7015-4424-a709-ac5e03e92698]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.307 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.310 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.338 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.504 186962 DEBUG nova.compute.manager [req-754c64f0-7a14-48f3-8707-c3546ae8c038 req-13b6ef9c-385d-4eac-a473-85b0eaa61ea0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.505 186962 DEBUG oslo_concurrency.lockutils [req-754c64f0-7a14-48f3-8707-c3546ae8c038 req-13b6ef9c-385d-4eac-a473-85b0eaa61ea0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.506 186962 DEBUG oslo_concurrency.lockutils [req-754c64f0-7a14-48f3-8707-c3546ae8c038 req-13b6ef9c-385d-4eac-a473-85b0eaa61ea0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.506 186962 DEBUG oslo_concurrency.lockutils [req-754c64f0-7a14-48f3-8707-c3546ae8c038 req-13b6ef9c-385d-4eac-a473-85b0eaa61ea0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.507 186962 DEBUG nova.compute.manager [req-754c64f0-7a14-48f3-8707-c3546ae8c038 req-13b6ef9c-385d-4eac-a473-85b0eaa61ea0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Processing event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.509 186962 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.512 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.517 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.518 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398933.5164793, 7a843394-1a35-483a-89ee-e5946c49ac74 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.518 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.534 186962 INFO nova.virt.libvirt.driver [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Instance spawned successfully.#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.535 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.539 186962 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkcenheiq',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.545 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.548 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.572 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.572 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.572 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.573 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.573 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.573 186962 DEBUG nova.virt.libvirt.driver [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.576 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.725 186962 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Took 53.56 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.727 186962 DEBUG nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:53 np0005539505 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.770 213906 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.770 213906 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:53.770 213906 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:53 np0005539505 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.898 186962 INFO nova.compute.manager [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Took 54.43 seconds to build instance.#033[00m
Nov 29 01:48:53 np0005539505 kernel: tap60d45f94-ad: entered promiscuous mode
Nov 29 01:48:53 np0005539505 NetworkManager[55134]: <info>  [1764398933.9231] manager: (tap60d45f94-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Nov 29 01:48:53 np0005539505 systemd-udevd[213854]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.932 186962 DEBUG oslo_concurrency.lockutils [None req-cfe00df3-e868-4ae7-b672-847985152362 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 54.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:53 np0005539505 NetworkManager[55134]: <info>  [1764398933.9354] device (tap60d45f94-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:48:53 np0005539505 NetworkManager[55134]: <info>  [1764398933.9364] device (tap60d45f94-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:48:53 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:53Z|00031|binding|INFO|Claiming lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c for this additional chassis.
Nov 29 01:48:53 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:53Z|00032|binding|INFO|60d45f94-ad4f-48ba-a0a9-6b5406aa616c: Claiming fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:53 np0005539505 nova_compute[186958]: 2025-11-29 06:48:53.966 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:53 np0005539505 systemd-machined[153285]: New machine qemu-2-instance-00000007.
Nov 29 01:48:54 np0005539505 systemd[1]: Started Virtual Machine qemu-2-instance-00000007.
Nov 29 01:48:54 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:54Z|00033|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c ovn-installed in OVS
Nov 29 01:48:54 np0005539505 nova_compute[186958]: 2025-11-29 06:48:54.018 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.339 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd57fe9-a863-4b16-9ca1-f78f02cdd1ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.340 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap425e933e-c1 in ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.342 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap425e933e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.342 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ab7814-d3f4-4b3f-bfb9-7a8ef7f5de1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.345 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[23776cae-879b-46a9-b135-60653572bb53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.370 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b4c427-3397-4284-b8fb-00a1f37e07ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.396 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f346ad9e-7b8a-4d1d-b509-ab528325c5cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.398 104094 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp_guib6wu/privsep.sock']#033[00m
Nov 29 01:48:54 np0005539505 nova_compute[186958]: 2025-11-29 06:48:54.592 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398934.5919907, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:54 np0005539505 nova_compute[186958]: 2025-11-29 06:48:54.593 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Started (Lifecycle Event)#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.051 104094 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.051 104094 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_guib6wu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.948 213984 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.953 213984 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.955 213984 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:54.956 213984 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213984#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.054 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8958cfc1-13c8-490f-b2bb-6e9733779e78]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.537 213984 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.538 213984 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:48:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:55.538 213984 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:48:55 np0005539505 podman[213989]: 2025-11-29 06:48:55.750411067 +0000 UTC m=+0.075869547 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.114 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ca1ee-3e38-4459-b1d2-b8333b51be49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 NetworkManager[55134]: <info>  [1764398936.1381] manager: (tap425e933e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.132 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4704cbd0-95a0-4dbc-8d69-4127e13fd840]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 systemd-udevd[214014]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.176 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f885d26a-aefb-4c2e-a1d1-ed3c751f769d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.181 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4a4ca094-9a5b-4d48-bf1c-adbc0e5d5a8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 NetworkManager[55134]: <info>  [1764398936.2021] device (tap425e933e-c0): carrier: link connected
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.206 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[31d89777-79c9-438d-8fe2-98c8febef111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.223 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f78031ba-109b-4c76-b346-04aa0a20e131]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438378, 'reachable_time': 29598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214032, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.241 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9595c383-c44f-425d-8e9c-292379e71ff6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe96:d291'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438378, 'tstamp': 438378}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214033, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.257 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b92dcc82-3267-4384-90d5-5325345ca13d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap425e933e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:96:d2:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438378, 'reachable_time': 29598, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214034, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.294 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[59c413eb-bff9-4b1f-8507-14e3e8972e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.351 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1e7e6d-f373-4b7f-b16f-cdbe5c815bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.353 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.354 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.354 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap425e933e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539505 NetworkManager[55134]: <info>  [1764398936.3575] manager: (tap425e933e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 29 01:48:56 np0005539505 kernel: tap425e933e-c0: entered promiscuous mode
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.362 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap425e933e-c0, col_values=(('external_ids', {'iface-id': 'c143daec-964e-4591-a13b-43e2014d70b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:48:56Z|00034|binding|INFO|Releasing lport c143daec-964e-4591-a13b-43e2014d70b5 from this chassis (sb_readonly=1)
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.380 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.383 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.384 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[00cedd65-18f0-4914-8246-a89826765fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.385 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/425e933e-ca72-466c-8d2b-499c7ba67318.pid.haproxy
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 425e933e-ca72-466c-8d2b-499c7ba67318
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:48:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:48:56.386 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'env', 'PROCESS_TAG=haproxy-425e933e-ca72-466c-8d2b-499c7ba67318', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/425e933e-ca72-466c-8d2b-499c7ba67318.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.420 186962 DEBUG nova.network.neutron [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Updated VIF entry in instance network info cache for port 2fb0f375-9270-4e22-8277-6d04ca007319. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.420 186962 DEBUG nova.network.neutron [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Updating instance_info_cache with network_info: [{"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.600 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.626 186962 DEBUG oslo_concurrency.lockutils [req-53ebce78-c6c1-4a15-b913-d564d013ee29 req-607681c6-d885-462f-b7b6-48aed56b1f35 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7a843394-1a35-483a-89ee-e5946c49ac74" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.633 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.681 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.682 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398935.8634589, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.683 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.706 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.710 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:48:56 np0005539505 podman[214067]: 2025-11-29 06:48:56.800428387 +0000 UTC m=+0.054562277 container create 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:48:56 np0005539505 systemd[1]: Started libpod-conmon-40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6.scope.
Nov 29 01:48:56 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:48:56 np0005539505 podman[214067]: 2025-11-29 06:48:56.772089689 +0000 UTC m=+0.026223589 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:48:56 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03e6631e1a3552956825ed608c138c4046918855a2bd7abdbcc7b2cac1a338bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:48:56 np0005539505 podman[214067]: 2025-11-29 06:48:56.884567735 +0000 UTC m=+0.138701715 container init 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 01:48:56 np0005539505 podman[214067]: 2025-11-29 06:48:56.89150442 +0000 UTC m=+0.145638330 container start 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:48:56 np0005539505 nova_compute[186958]: 2025-11-29 06:48:56.900 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 29 01:48:56 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [NOTICE]   (214087) : New worker (214089) forked
Nov 29 01:48:56 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [NOTICE]   (214087) : Loading success.
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.218 186962 DEBUG nova.compute.manager [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.219 186962 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.219 186962 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.220 186962 DEBUG oslo_concurrency.lockutils [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.220 186962 DEBUG nova.compute.manager [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] No waiting events found dispatching network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.220 186962 WARNING nova.compute.manager [req-6f90086e-a01c-4f20-9f80-0998eb30b6ca req-7ac1668d-bf20-484c-853f-b187bb8f11d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received unexpected event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.633 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 01:49:01 np0005539505 nova_compute[186958]: 2025-11-29 06:49:01.637 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:01Z|00035|binding|INFO|Claiming lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c for this chassis.
Nov 29 01:49:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:01Z|00036|binding|INFO|60d45f94-ad4f-48ba-a0a9-6b5406aa616c: Claiming fa:16:3e:86:09:58 10.100.0.4
Nov 29 01:49:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:01Z|00037|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c up in Southbound
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.718 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.720 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 bound to our chassis#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.723 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.734 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[413ce655-d1a0-47c2-b26f-659f17cf92c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.735 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24ee44f0-21 in ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.736 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24ee44f0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[98d0563c-c79d-441d-8e65-1b83d2c9924f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.737 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[25454446-c592-4862-adb6-dd865752cb7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.765 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[270d4121-8be1-4797-a275-64ce2e698380]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.792 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[190d3b70-9fbe-415f-90d4-384668af93ca]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.827 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[eba21c18-eb3b-4c25-a771-8986c86b08f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.834 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[096d0843-943c-4967-8408-cb8e33c52d51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 NetworkManager[55134]: <info>  [1764398941.8401] manager: (tap24ee44f0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.873 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[987559db-8d1d-4bf9-9b55-d66453713bd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.876 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[937ba842-caa7-4fca-8512-763797881b57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 systemd-udevd[214125]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:49:01 np0005539505 podman[214101]: 2025-11-29 06:49:01.892982743 +0000 UTC m=+0.099126991 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:49:01 np0005539505 NetworkManager[55134]: <info>  [1764398941.9015] device (tap24ee44f0-20): carrier: link connected
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.910 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[acde507c-fe3a-45a9-9fae-345c9b7aa1d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.926 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[51701240-1a95-4449-bd6e-2e8dac7f562c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438948, 'reachable_time': 41915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214145, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.940 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f17186c8-135b-4d27-a636-98d83fa7ca43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:940c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438948, 'tstamp': 438948}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214146, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.955 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d898bc5c-f1ae-4288-82e0-d70aa595af14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438948, 'reachable_time': 41915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214148, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:01.982 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0413cffa-fc68-4bfc-a02f-345e4fa81df2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.038 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[02b76e62-622e-4c63-a41f-1eb741af8fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.040 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.040 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.040 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24ee44f0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:02 np0005539505 kernel: tap24ee44f0-20: entered promiscuous mode
Nov 29 01:49:02 np0005539505 NetworkManager[55134]: <info>  [1764398942.0432] manager: (tap24ee44f0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.046 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24ee44f0-20, col_values=(('external_ids', {'iface-id': 'ffbd3b8f-7e45-45d4-84ce-cd74c712f992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:02 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:02Z|00038|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.049 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.051 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.051 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3112bff3-c5c4-4725-9466-b4f318cf81f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.052 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:49:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:02.053 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'env', 'PROCESS_TAG=haproxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.073 186962 INFO nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Post operation of migration started#033[00m
Nov 29 01:49:02 np0005539505 podman[214181]: 2025-11-29 06:49:02.412544915 +0000 UTC m=+0.032731393 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:49:02 np0005539505 podman[214181]: 2025-11-29 06:49:02.516301395 +0000 UTC m=+0.136487833 container create 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:49:02 np0005539505 systemd[1]: Started libpod-conmon-247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78.scope.
Nov 29 01:49:02 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:49:02 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/738ec18c83fec97b4b6bdf7c715be75e79b66d6b6428d67df91aafbba909561b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:49:02 np0005539505 podman[214181]: 2025-11-29 06:49:02.635631423 +0000 UTC m=+0.255817941 container init 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 01:49:02 np0005539505 podman[214181]: 2025-11-29 06:49:02.646062656 +0000 UTC m=+0.266249084 container start 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:49:02 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [NOTICE]   (214200) : New worker (214202) forked
Nov 29 01:49:02 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [NOTICE]   (214200) : Loading success.
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.915 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.916 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:02 np0005539505 nova_compute[186958]: 2025-11-29 06:49:02.917 186962 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:49:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:05Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:e3:8f 10.1.0.39
Nov 29 01:49:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:05Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:e3:8f 10.1.0.39
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.652 186962 DEBUG nova.network.neutron [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.685 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.730 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.730 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.730 186962 DEBUG oslo_concurrency.lockutils [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:06 np0005539505 nova_compute[186958]: 2025-11-29 06:49:06.734 186962 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:49:06 np0005539505 virtqemud[186353]: Domain id=2 name='instance-00000007' uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e is tainted: custom-monitor
Nov 29 01:49:07 np0005539505 nova_compute[186958]: 2025-11-29 06:49:07.741 186962 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.189 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.189 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.189 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.189 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.190 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.199 186962 INFO nova.compute.manager [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Terminating instance#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.207 186962 DEBUG nova.compute.manager [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:49:08 np0005539505 kernel: tap2fb0f375-92 (unregistering): left promiscuous mode
Nov 29 01:49:08 np0005539505 NetworkManager[55134]: <info>  [1764398948.2507] device (tap2fb0f375-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.253 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:08Z|00039|binding|INFO|Releasing lport 2fb0f375-9270-4e22-8277-6d04ca007319 from this chassis (sb_readonly=0)
Nov 29 01:49:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:08Z|00040|binding|INFO|Setting lport 2fb0f375-9270-4e22-8277-6d04ca007319 down in Southbound
Nov 29 01:49:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:08Z|00041|binding|INFO|Removing iface tap2fb0f375-92 ovn-installed in OVS
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.255 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:08.266 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:e3:8f 10.1.0.39 fdfe:381f:8400::2f4'], port_security=['fa:16:3e:99:e3:8f 10.1.0.39 fdfe:381f:8400::2f4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.39/26 fdfe:381f:8400::2f4/64', 'neutron:device_id': '7a843394-1a35-483a-89ee-e5946c49ac74', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-425e933e-ca72-466c-8d2b-499c7ba67318', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d2e7db012114f9eb8e8e1b0123c9974', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5cacaa01-dff2-46af-9e49-4a741508795b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=236265de-856a-468e-8ed3-00d3e824203d, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2fb0f375-9270-4e22-8277-6d04ca007319) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:08.268 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2fb0f375-9270-4e22-8277-6d04ca007319 in datapath 425e933e-ca72-466c-8d2b-499c7ba67318 unbound from our chassis#033[00m
Nov 29 01:49:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:08.271 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 425e933e-ca72-466c-8d2b-499c7ba67318, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:49:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:08.272 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a34ea311-479e-4d36-8d8d-07d36708ad79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:08.272 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 namespace which is not needed anymore#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 29 01:49:08 np0005539505 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000004.scope: Consumed 12.519s CPU time.
Nov 29 01:49:08 np0005539505 systemd-machined[153285]: Machine qemu-1-instance-00000004 terminated.
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.428 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.435 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.477 186962 INFO nova.virt.libvirt.driver [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Instance destroyed successfully.#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.478 186962 DEBUG nova.objects.instance [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lazy-loading 'resources' on Instance uuid 7a843394-1a35-483a-89ee-e5946c49ac74 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.499 186962 DEBUG nova.virt.libvirt.vif [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:47:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-526752650-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-526752650-1',id=4,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d2e7db012114f9eb8e8e1b0123c9974',ramdisk_id='',reservation_id='r-o9m0h1rn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-AutoAllocateNetworkTest-224859463',owner_user_name='tempest-AutoAllocateNetworkTest-224859463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:48:53Z,user_data=None,user_id='7a31c969c2f744a9810fc9890dd7acb2',uuid=7a843394-1a35-483a-89ee-e5946c49ac74,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.499 186962 DEBUG nova.network.os_vif_util [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converting VIF {"id": "2fb0f375-9270-4e22-8277-6d04ca007319", "address": "fa:16:3e:99:e3:8f", "network": {"id": "425e933e-ca72-466c-8d2b-499c7ba67318", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::2f4", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.39", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d2e7db012114f9eb8e8e1b0123c9974", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fb0f375-92", "ovs_interfaceid": "2fb0f375-9270-4e22-8277-6d04ca007319", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.500 186962 DEBUG nova.network.os_vif_util [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.501 186962 DEBUG os_vif [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.504 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.505 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fb0f375-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.506 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.508 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.509 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.512 186962 INFO os_vif [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:e3:8f,bridge_name='br-int',has_traffic_filtering=True,id=2fb0f375-9270-4e22-8277-6d04ca007319,network=Network(425e933e-ca72-466c-8d2b-499c7ba67318),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fb0f375-92')#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.512 186962 INFO nova.virt.libvirt.driver [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Deleting instance files /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74_del#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.513 186962 INFO nova.virt.libvirt.driver [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Deletion of /var/lib/nova/instances/7a843394-1a35-483a-89ee-e5946c49ac74_del complete#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.595 186962 DEBUG nova.virt.libvirt.host [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.595 186962 INFO nova.virt.libvirt.host [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] UEFI support detected#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.597 186962 INFO nova.compute.manager [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.597 186962 DEBUG oslo.service.loopingcall [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.597 186962 DEBUG nova.compute.manager [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.597 186962 DEBUG nova.network.neutron [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.746 186962 INFO nova.virt.libvirt.driver [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.752 186962 DEBUG nova.compute.manager [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:08 np0005539505 nova_compute[186958]: 2025-11-29 06:49:08.775 186962 DEBUG nova.objects.instance [None req-339fa41d-d53e-473d-ab00-1ffab5bcf559 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [NOTICE]   (214087) : haproxy version is 2.8.14-c23fe91
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [NOTICE]   (214087) : path to executable is /usr/sbin/haproxy
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [WARNING]  (214087) : Exiting Master process...
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [WARNING]  (214087) : Exiting Master process...
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [ALERT]    (214087) : Current worker (214089) exited with code 143 (Terminated)
Nov 29 01:49:09 np0005539505 neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318[214083]: [WARNING]  (214087) : All workers exited. Exiting... (0)
Nov 29 01:49:09 np0005539505 systemd[1]: libpod-40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6.scope: Deactivated successfully.
Nov 29 01:49:09 np0005539505 podman[214256]: 2025-11-29 06:49:09.145700421 +0000 UTC m=+0.783225573 container died 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 01:49:09 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6-userdata-shm.mount: Deactivated successfully.
Nov 29 01:49:09 np0005539505 systemd[1]: var-lib-containers-storage-overlay-03e6631e1a3552956825ed608c138c4046918855a2bd7abdbcc7b2cac1a338bc-merged.mount: Deactivated successfully.
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.389 186962 DEBUG nova.compute.manager [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-unplugged-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.390 186962 DEBUG oslo_concurrency.lockutils [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.390 186962 DEBUG oslo_concurrency.lockutils [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.390 186962 DEBUG oslo_concurrency.lockutils [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.391 186962 DEBUG nova.compute.manager [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] No waiting events found dispatching network-vif-unplugged-2fb0f375-9270-4e22-8277-6d04ca007319 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:09 np0005539505 nova_compute[186958]: 2025-11-29 06:49:09.391 186962 DEBUG nova.compute.manager [req-4715c2d9-22b6-4a37-932a-14b35d28bc0b req-2cdf06bc-2b9d-49e2-bdf7-bd8b3f22c670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-unplugged-2fb0f375-9270-4e22-8277-6d04ca007319 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:09 np0005539505 podman[214256]: 2025-11-29 06:49:09.535539162 +0000 UTC m=+1.173064304 container cleanup 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:49:09 np0005539505 systemd[1]: libpod-conmon-40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6.scope: Deactivated successfully.
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.637 186962 DEBUG nova.network.neutron [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.661 186962 INFO nova.compute.manager [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Took 2.06 seconds to deallocate network for instance.#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.775 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.775 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.860 186962 DEBUG nova.compute.provider_tree [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:10 np0005539505 podman[214304]: 2025-11-29 06:49:10.870559403 +0000 UTC m=+1.314070592 container remove 40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.878 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b45cf02c-7c1a-4095-9970-c2bb5c89e14f]: (4, ('Sat Nov 29 06:49:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6)\n40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6\nSat Nov 29 06:49:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 (40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6)\n40b9f7b5eb0e00fdf191bd503aa85553a704575230f0ac9bc30c5c56e02feec6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.880 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[79e07c1f-b88a-4740-b593-de193e853551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.881 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap425e933e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:10 np0005539505 kernel: tap425e933e-c0: left promiscuous mode
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.888 186962 DEBUG nova.scheduler.client.report [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.901 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d48d243f-e2be-46d1-bc5b-818a62b00be4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.916 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e91cafe2-ef39-4645-a633-31764df8170a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.918 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5b25f00e-3e3e-4a78-b9f5-fd1f37df8c41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.928 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.935 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae252077-c669-4f96-9ef5-4f7820d6081a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438368, 'reachable_time': 25357, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214320, 'error': None, 'target': 'ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 systemd[1]: run-netns-ovnmeta\x2d425e933e\x2dca72\x2d466c\x2d8d2b\x2d499c7ba67318.mount: Deactivated successfully.
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.945 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-425e933e-ca72-466c-8d2b-499c7ba67318 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:49:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:10.947 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb74944-e51d-4846-b06f-ebfd94264ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:10 np0005539505 nova_compute[186958]: 2025-11-29 06:49:10.995 186962 INFO nova.scheduler.client.report [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Deleted allocations for instance 7a843394-1a35-483a-89ee-e5946c49ac74#033[00m
Nov 29 01:49:11 np0005539505 nova_compute[186958]: 2025-11-29 06:49:11.083 186962 DEBUG oslo_concurrency.lockutils [None req-4751f6b9-d725-4fd3-93a8-09926578f0ac 7a31c969c2f744a9810fc9890dd7acb2 6d2e7db012114f9eb8e8e1b0123c9974 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:11 np0005539505 nova_compute[186958]: 2025-11-29 06:49:11.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.760 186962 DEBUG nova.compute.manager [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.761 186962 DEBUG oslo_concurrency.lockutils [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.761 186962 DEBUG oslo_concurrency.lockutils [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.762 186962 DEBUG oslo_concurrency.lockutils [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7a843394-1a35-483a-89ee-e5946c49ac74-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.762 186962 DEBUG nova.compute.manager [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] No waiting events found dispatching network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.762 186962 WARNING nova.compute.manager [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received unexpected event network-vif-plugged-2fb0f375-9270-4e22-8277-6d04ca007319 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:49:12 np0005539505 nova_compute[186958]: 2025-11-29 06:49:12.763 186962 DEBUG nova.compute.manager [req-26aeba09-741e-4536-b37e-a6263c4ee304 req-35dae936-2c41-4097-a7fe-5910f815c9e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Received event network-vif-deleted-2fb0f375-9270-4e22-8277-6d04ca007319 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.507 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.582 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Check if temp file /var/lib/nova/instances/tmpi17kuexu exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.588 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.642 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.643 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.694 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:13 np0005539505 nova_compute[186958]: 2025-11-29 06:49:13.695 186962 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 29 01:49:14 np0005539505 podman[214330]: 2025-11-29 06:49:14.731356345 +0000 UTC m=+0.050566914 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:49:14 np0005539505 podman[214329]: 2025-11-29 06:49:14.755254008 +0000 UTC m=+0.070560217 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 29 01:49:15 np0005539505 nova_compute[186958]: 2025-11-29 06:49:15.139 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:15 np0005539505 nova_compute[186958]: 2025-11-29 06:49:15.195 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:15 np0005539505 nova_compute[186958]: 2025-11-29 06:49:15.196 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:15 np0005539505 nova_compute[186958]: 2025-11-29 06:49:15.247 186962 DEBUG oslo_concurrency.processutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:16 np0005539505 nova_compute[186958]: 2025-11-29 06:49:16.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539505 systemd-logind[794]: New session 27 of user nova.
Nov 29 01:49:18 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:49:18 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:49:18 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:49:18 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:49:18 np0005539505 podman[214381]: 2025-11-29 06:49:18.398016444 +0000 UTC m=+0.055688718 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:49:18 np0005539505 nova_compute[186958]: 2025-11-29 06:49:18.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:18 np0005539505 systemd[214402]: Queued start job for default target Main User Target.
Nov 29 01:49:18 np0005539505 systemd[214402]: Created slice User Application Slice.
Nov 29 01:49:18 np0005539505 systemd[214402]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:49:18 np0005539505 systemd[214402]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:49:18 np0005539505 systemd[214402]: Reached target Paths.
Nov 29 01:49:18 np0005539505 systemd[214402]: Reached target Timers.
Nov 29 01:49:18 np0005539505 systemd[214402]: Starting D-Bus User Message Bus Socket...
Nov 29 01:49:18 np0005539505 systemd[214402]: Starting Create User's Volatile Files and Directories...
Nov 29 01:49:18 np0005539505 systemd[214402]: Finished Create User's Volatile Files and Directories.
Nov 29 01:49:18 np0005539505 systemd[214402]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:49:18 np0005539505 systemd[214402]: Reached target Sockets.
Nov 29 01:49:18 np0005539505 systemd[214402]: Reached target Basic System.
Nov 29 01:49:18 np0005539505 systemd[214402]: Reached target Main User Target.
Nov 29 01:49:18 np0005539505 systemd[214402]: Startup finished in 142ms.
Nov 29 01:49:18 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:49:18 np0005539505 systemd[1]: Started Session 27 of User nova.
Nov 29 01:49:18 np0005539505 systemd-logind[794]: Session 27 logged out. Waiting for processes to exit.
Nov 29 01:49:18 np0005539505 systemd[1]: session-27.scope: Deactivated successfully.
Nov 29 01:49:18 np0005539505 systemd-logind[794]: Removed session 27.
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.514 186962 INFO nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Took 5.27 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.515 186962 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.532 186962 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpi17kuexu',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='af865d23-0f24-47aa-aeab-1c12d04b5a1e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e0ad21d7-8522-489e-9b33-aaa42aaf42b7),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.559 186962 DEBUG nova.objects.instance [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid af865d23-0f24-47aa-aeab-1c12d04b5a1e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.560 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.562 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.562 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.580 186962 DEBUG nova.virt.libvirt.vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:49:08Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.580 186962 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.581 186962 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.581 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 01:49:20 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:86:09:58"/>
Nov 29 01:49:20 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 01:49:20 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:49:20 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 01:49:20 np0005539505 nova_compute[186958]:  <target dev="tap60d45f94-ad"/>
Nov 29 01:49:20 np0005539505 nova_compute[186958]: </interface>
Nov 29 01:49:20 np0005539505 nova_compute[186958]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.582 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.591 186962 DEBUG nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.592 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.592 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.592 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.592 186962 DEBUG nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.593 186962 DEBUG nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.593 186962 DEBUG nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.593 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.593 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.593 186962 DEBUG oslo_concurrency.lockutils [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.594 186962 DEBUG nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:20 np0005539505 nova_compute[186958]: 2025-11-29 06:49:20.594 186962 WARNING nova.compute.manager [req-0097b97d-38ff-4d22-b746-e83930930eb4 req-4a722082-719c-46ed-87e9-fffaa24f7c7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.065 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.065 186962 INFO nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.278 186962 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.780 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:49:21 np0005539505 nova_compute[186958]: 2025-11-29 06:49:21.780 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.283 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.284 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.788 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.788 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.849 186962 DEBUG nova.compute.manager [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.849 186962 DEBUG nova.compute.manager [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing instance network info cache due to event network-changed-60d45f94-ad4f-48ba-a0a9-6b5406aa616c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.849 186962 DEBUG oslo_concurrency.lockutils [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.849 186962 DEBUG oslo_concurrency.lockutils [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:22 np0005539505 nova_compute[186958]: 2025-11-29 06:49:22.850 186962 DEBUG nova.network.neutron [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Refreshing network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:49:23 np0005539505 nova_compute[186958]: 2025-11-29 06:49:23.475 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398948.4746115, 7a843394-1a35-483a-89ee-e5946c49ac74 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:23 np0005539505 nova_compute[186958]: 2025-11-29 06:49:23.476 186962 INFO nova.compute.manager [-] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:49:23 np0005539505 nova_compute[186958]: 2025-11-29 06:49:23.515 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:23 np0005539505 nova_compute[186958]: 2025-11-29 06:49:23.573 186962 DEBUG nova.compute.manager [None req-2d8f1c70-74df-4499-8fd4-afd053b2cd77 - - - - - -] [instance: 7a843394-1a35-483a-89ee-e5946c49ac74] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:24 np0005539505 podman[214434]: 2025-11-29 06:49:24.352589391 +0000 UTC m=+0.672903139 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:49:24 np0005539505 podman[214435]: 2025-11-29 06:49:24.390407595 +0000 UTC m=+0.708714166 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.541 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398964.5411396, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.542 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.544 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.545 186962 DEBUG nova.virt.libvirt.migration [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.645 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.777 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:24 np0005539505 nova_compute[186958]: 2025-11-29 06:49:24.799 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 29 01:49:24 np0005539505 kernel: tap60d45f94-ad (unregistering): left promiscuous mode
Nov 29 01:49:25 np0005539505 NetworkManager[55134]: <info>  [1764398965.0120] device (tap60d45f94-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:49:25 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:25Z|00042|binding|INFO|Releasing lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c from this chassis (sb_readonly=0)
Nov 29 01:49:25 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:25Z|00043|binding|INFO|Setting lport 60d45f94-ad4f-48ba-a0a9-6b5406aa616c down in Southbound
Nov 29 01:49:25 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:25Z|00044|binding|INFO|Removing iface tap60d45f94-ad ovn-installed in OVS
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:25 np0005539505 ovn_controller[95143]: 2025-11-29T06:49:25Z|00045|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:49:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:25.090 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:09:58 10.100.0.4'], port_security=['fa:16:3e:86:09:58 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'a43628b3-9efd-4940-9509-686038e16aeb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'af865d23-0f24-47aa-aeab-1c12d04b5a1e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '18', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=60d45f94-ad4f-48ba-a0a9-6b5406aa616c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:25.093 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis#033[00m
Nov 29 01:49:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:25.096 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:49:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:25.098 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[697b825b-1871-4a26-a3ce-e7ddc9123107]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:25.099 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace which is not needed anymore#033[00m
Nov 29 01:49:25 np0005539505 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 29 01:49:25 np0005539505 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000007.scope: Consumed 2.436s CPU time.
Nov 29 01:49:25 np0005539505 systemd-machined[153285]: Machine qemu-2-instance-00000007 terminated.
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.205 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.220 186962 DEBUG nova.network.neutron [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updated VIF entry in instance network info cache for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.221 186962 DEBUG nova.network.neutron [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Updating instance_info_cache with network_info: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.249 186962 DEBUG oslo_concurrency.lockutils [req-2f6d8eb6-8e0e-4653-8842-36f09f0822fd req-92d5f67a-774b-4e00-80eb-ca57f34814e2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-af865d23-0f24-47aa-aeab-1c12d04b5a1e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.250 186962 DEBUG nova.virt.libvirt.guest [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.251 186962 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation has completed#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.252 186962 INFO nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] _post_live_migration() is started..#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.255 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.255 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.255 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.365 186962 DEBUG nova.compute.manager [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.366 186962 DEBUG oslo_concurrency.lockutils [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.367 186962 DEBUG oslo_concurrency.lockutils [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.367 186962 DEBUG oslo_concurrency.lockutils [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.367 186962 DEBUG nova.compute.manager [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:25 np0005539505 nova_compute[186958]: 2025-11-29 06:49:25.368 186962 DEBUG nova.compute.manager [req-fda3d74a-fd6b-46b9-9ba0-36d82dc5bd8b req-a21eb62f-76fd-4f52-858b-d50ef15a68b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [NOTICE]   (214200) : haproxy version is 2.8.14-c23fe91
Nov 29 01:49:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [NOTICE]   (214200) : path to executable is /usr/sbin/haproxy
Nov 29 01:49:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [WARNING]  (214200) : Exiting Master process...
Nov 29 01:49:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [ALERT]    (214200) : Current worker (214202) exited with code 143 (Terminated)
Nov 29 01:49:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[214196]: [WARNING]  (214200) : All workers exited. Exiting... (0)
Nov 29 01:49:25 np0005539505 systemd[1]: libpod-247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78.scope: Deactivated successfully.
Nov 29 01:49:25 np0005539505 podman[214510]: 2025-11-29 06:49:25.516574388 +0000 UTC m=+0.302140114 container died 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.686 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.769 186962 DEBUG nova.network.neutron [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Activated binding for port 60d45f94-ad4f-48ba-a0a9-6b5406aa616c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.769 186962 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.770 186962 DEBUG nova.virt.libvirt.vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:48:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1791593514',display_name='tempest-LiveMigrationTest-server-1791593514',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1791593514',id=7,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:48:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-4vq3oq0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:49:12Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=af865d23-0f24-47aa-aeab-1c12d04b5a1e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.771 186962 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "address": "fa:16:3e:86:09:58", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap60d45f94-ad", "ovs_interfaceid": "60d45f94-ad4f-48ba-a0a9-6b5406aa616c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.772 186962 DEBUG nova.network.os_vif_util [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.772 186962 DEBUG os_vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.774 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.774 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap60d45f94-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.776 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.780 186962 INFO os_vif [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:09:58,bridge_name='br-int',has_traffic_filtering=True,id=60d45f94-ad4f-48ba-a0a9-6b5406aa616c,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap60d45f94-ad')#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.781 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.781 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.781 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.782 186962 DEBUG nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.782 186962 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deleting instance files /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del#033[00m
Nov 29 01:49:26 np0005539505 nova_compute[186958]: 2025-11-29 06:49:26.783 186962 INFO nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Deletion of /var/lib/nova/instances/af865d23-0f24-47aa-aeab-1c12d04b5a1e_del complete#033[00m
Nov 29 01:49:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:26.925 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:26.926 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:26.926 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.608 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.609 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.609 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.609 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.610 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.610 186962 WARNING nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.610 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.611 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.611 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.611 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.612 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.612 186962 WARNING nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.612 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.612 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.613 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.613 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.613 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.613 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-unplugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.614 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.614 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.614 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.614 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.614 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.615 186962 WARNING nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.615 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.615 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.615 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.616 186962 DEBUG oslo_concurrency.lockutils [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.616 186962 DEBUG nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] No waiting events found dispatching network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:49:27 np0005539505 nova_compute[186958]: 2025-11-29 06:49:27.616 186962 WARNING nova.compute.manager [req-d664e847-fc69-4021-9dcb-d814d927a4bc req-511a95a4-d580-4092-9afd-033ed64b8630 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Received unexpected event network-vif-plugged-60d45f94-ad4f-48ba-a0a9-6b5406aa616c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:49:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78-userdata-shm.mount: Deactivated successfully.
Nov 29 01:49:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay-738ec18c83fec97b4b6bdf7c715be75e79b66d6b6428d67df91aafbba909561b-merged.mount: Deactivated successfully.
Nov 29 01:49:27 np0005539505 podman[214510]: 2025-11-29 06:49:27.870456022 +0000 UTC m=+2.656021778 container cleanup 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:49:27 np0005539505 podman[214548]: 2025-11-29 06:49:27.874144146 +0000 UTC m=+1.166091378 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:49:27 np0005539505 systemd[1]: libpod-conmon-247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78.scope: Deactivated successfully.
Nov 29 01:49:27 np0005539505 podman[214573]: 2025-11-29 06:49:27.983762921 +0000 UTC m=+0.075972689 container remove 247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:27.990 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2462c9-0a82-4914-b12b-4e6a2753810d]: (4, ('Sat Nov 29 06:49:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78)\n247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78\nSat Nov 29 06:49:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78)\n247dbff2b6e0ce255610dcd628770b234ef0bc74bdfa7dc67b2f2f31c26ead78\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:27.991 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c916d35f-da42-439e-933d-d1afc994aa6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:27.992 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:28 np0005539505 nova_compute[186958]: 2025-11-29 06:49:28.021 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:28 np0005539505 kernel: tap24ee44f0-20: left promiscuous mode
Nov 29 01:49:28 np0005539505 nova_compute[186958]: 2025-11-29 06:49:28.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:28 np0005539505 nova_compute[186958]: 2025-11-29 06:49:28.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.036 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b1e6f1-08a2-4e28-b316-a3868c251fb8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.054 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a3af3936-d682-4217-9433-9ca23bbdbda2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.055 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31f04458-648f-41df-9166-750e3a51a977]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.070 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[227b8a63-75cf-45d1-be67-02d32225fe23]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438940, 'reachable_time': 31850, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214588, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:28 np0005539505 systemd[1]: run-netns-ovnmeta\x2d24ee44f0\x2d2b10\x2d459c\x2daabf\x2dbf9ef2c8d950.mount: Deactivated successfully.
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.073 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:49:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:28.074 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e33d2c-a38d-4a46-b676-1e970b5c48e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:49:28 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:49:28 np0005539505 systemd[214402]: Activating special unit Exit the Session...
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped target Main User Target.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped target Basic System.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped target Paths.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped target Sockets.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped target Timers.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:49:28 np0005539505 systemd[214402]: Closed D-Bus User Message Bus Socket.
Nov 29 01:49:28 np0005539505 systemd[214402]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:49:28 np0005539505 systemd[214402]: Removed slice User Application Slice.
Nov 29 01:49:28 np0005539505 systemd[214402]: Reached target Shutdown.
Nov 29 01:49:28 np0005539505 systemd[214402]: Finished Exit the Session.
Nov 29 01:49:28 np0005539505 systemd[214402]: Reached target Exit the Session.
Nov 29 01:49:28 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:49:28 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:49:28 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:49:28 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:49:28 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:49:28 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:49:28 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:49:31 np0005539505 nova_compute[186958]: 2025-11-29 06:49:31.688 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:31 np0005539505 nova_compute[186958]: 2025-11-29 06:49:31.776 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:32 np0005539505 podman[214595]: 2025-11-29 06:49:32.729091506 +0000 UTC m=+0.062493230 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:49:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:34.245 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:49:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:34.245 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.416 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.417 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.417 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "af865d23-0f24-47aa-aeab-1c12d04b5a1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.435 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.436 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.436 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.436 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.591 186962 WARNING nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.592 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5779MB free_disk=73.34427261352539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.592 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.593 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.655 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration for instance af865d23-0f24-47aa-aeab-1c12d04b5a1e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.714 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.741 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Migration e0ad21d7-8522-489e-9b33-aaa42aaf42b7 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.742 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.742 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.780 186962 DEBUG nova.compute.provider_tree [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.795 186962 DEBUG nova.scheduler.client.report [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.813 186962 DEBUG nova.compute.resource_tracker [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.813 186962 DEBUG oslo_concurrency.lockutils [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.827 186962 INFO nova.compute.manager [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Migrating instance to compute-1.ctlplane.example.com finished successfully.#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.930 186962 INFO nova.scheduler.client.report [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Deleted allocation for migration e0ad21d7-8522-489e-9b33-aaa42aaf42b7#033[00m
Nov 29 01:49:34 np0005539505 nova_compute[186958]: 2025-11-29 06:49:34.930 186962 DEBUG nova.virt.libvirt.driver [None req-b660cf01-5dba-498a-a466-0155a5cfae37 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 29 01:49:36 np0005539505 nova_compute[186958]: 2025-11-29 06:49:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:36 np0005539505 nova_compute[186958]: 2025-11-29 06:49:36.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:36 np0005539505 nova_compute[186958]: 2025-11-29 06:49:36.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:38 np0005539505 nova_compute[186958]: 2025-11-29 06:49:38.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:38 np0005539505 nova_compute[186958]: 2025-11-29 06:49:38.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:39 np0005539505 nova_compute[186958]: 2025-11-29 06:49:39.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:39 np0005539505 nova_compute[186958]: 2025-11-29 06:49:39.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:49:39 np0005539505 nova_compute[186958]: 2025-11-29 06:49:39.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:49:39 np0005539505 nova_compute[186958]: 2025-11-29 06:49:39.398 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:49:39 np0005539505 nova_compute[186958]: 2025-11-29 06:49:39.399 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.255 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398965.250626, af865d23-0f24-47aa-aeab-1c12d04b5a1e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.256 186962 INFO nova.compute.manager [-] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.277 186962 DEBUG nova.compute.manager [None req-af6b2fec-ac40-49ab-9c30-625b55ab53e8 - - - - - -] [instance: af865d23-0f24-47aa-aeab-1c12d04b5a1e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.978 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.979 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.979 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:40 np0005539505 nova_compute[186958]: 2025-11-29 06:49:40.980 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.197 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.199 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5782MB free_disk=73.34427261352539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.199 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.199 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:49:41.247 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.267 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.267 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.285 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.297 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.298 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.298 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.698 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:41 np0005539505 nova_compute[186958]: 2025-11-29 06:49:41.780 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.295 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.662 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "7957f954-232f-402f-98b1-f5e740a43946" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.663 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.682 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.798 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.798 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.804 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.804 186962 INFO nova.compute.claims [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.955 186962 DEBUG nova.compute.provider_tree [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:42 np0005539505 nova_compute[186958]: 2025-11-29 06:49:42.986 186962 DEBUG nova.scheduler.client.report [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.015 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.016 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.094 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.094 186962 DEBUG nova.network.neutron [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.116 186962 INFO nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.143 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.288 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.290 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.290 186962 INFO nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Creating image(s)#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.291 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.292 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.292 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.309 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.363 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.364 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.365 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.381 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.470 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.471 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.939 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk 1073741824" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.941 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:43 np0005539505 nova_compute[186958]: 2025-11-29 06:49:43.941 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.014 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.015 186962 DEBUG nova.virt.disk.api [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Checking if we can resize image /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.015 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.067 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.068 186962 DEBUG nova.virt.disk.api [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Cannot resize image /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.068 186962 DEBUG nova.objects.instance [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'migration_context' on Instance uuid 7957f954-232f-402f-98b1-f5e740a43946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.094 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.095 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Ensure instance console log exists: /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.095 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.095 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.096 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.176 186962 DEBUG nova.network.neutron [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.177 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.180 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.186 186962 WARNING nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.190 186962 DEBUG nova.virt.libvirt.host [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.191 186962 DEBUG nova.virt.libvirt.host [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.194 186962 DEBUG nova.virt.libvirt.host [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.194 186962 DEBUG nova.virt.libvirt.host [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.195 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.195 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.196 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.196 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.196 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.196 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.197 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.197 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.197 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.197 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.198 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.198 186962 DEBUG nova.virt.hardware [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.201 186962 DEBUG nova.objects.instance [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7957f954-232f-402f-98b1-f5e740a43946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.219 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <uuid>7957f954-232f-402f-98b1-f5e740a43946</uuid>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <name>instance-00000009</name>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:name>tempest-LiveMigrationNegativeTest-server-370146271</nova:name>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:49:44</nova:creationTime>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:user uuid="2d4cdd10f0da450ea4816d37bf63eb69">tempest-LiveMigrationNegativeTest-1866892842-project-member</nova:user>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:        <nova:project uuid="574d8ee971ce4fc39fff37888dddd4e1">tempest-LiveMigrationNegativeTest-1866892842</nova:project>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="serial">7957f954-232f-402f-98b1-f5e740a43946</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="uuid">7957f954-232f-402f-98b1-f5e740a43946</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.config"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/console.log" append="off"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:49:44 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:49:44 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:49:44 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:49:44 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.647 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.648 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:44 np0005539505 nova_compute[186958]: 2025-11-29 06:49:44.649 186962 INFO nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Using config drive#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.053 186962 INFO nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Creating config drive at /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.config#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.058 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq0qgg54u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.179 186962 DEBUG oslo_concurrency.processutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq0qgg54u" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:45 np0005539505 systemd-machined[153285]: New machine qemu-3-instance-00000009.
Nov 29 01:49:45 np0005539505 systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Nov 29 01:49:45 np0005539505 podman[214642]: 2025-11-29 06:49:45.31597992 +0000 UTC m=+0.074943240 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:49:45 np0005539505 podman[214641]: 2025-11-29 06:49:45.32204456 +0000 UTC m=+0.081610597 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm)
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.782 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398985.7817247, 7957f954-232f-402f-98b1-f5e740a43946 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.782 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.785 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.785 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.789 186962 INFO nova.virt.libvirt.driver [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance spawned successfully.#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.789 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.806 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.811 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.815 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.815 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.816 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.816 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.817 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.817 186962 DEBUG nova.virt.libvirt.driver [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.839 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.840 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398985.7826638, 7957f954-232f-402f-98b1-f5e740a43946 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.840 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] VM Started (Lifecycle Event)#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.872 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.874 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.951 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.966 186962 INFO nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Took 2.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:49:45 np0005539505 nova_compute[186958]: 2025-11-29 06:49:45.966 186962 DEBUG nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:46 np0005539505 nova_compute[186958]: 2025-11-29 06:49:46.087 186962 INFO nova.compute.manager [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Took 3.33 seconds to build instance.#033[00m
Nov 29 01:49:46 np0005539505 nova_compute[186958]: 2025-11-29 06:49:46.109 186962 DEBUG oslo_concurrency.lockutils [None req-fe281c95-ff21-4047-a346-27437cca30ca 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:46 np0005539505 nova_compute[186958]: 2025-11-29 06:49:46.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:46 np0005539505 nova_compute[186958]: 2025-11-29 06:49:46.782 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:48 np0005539505 podman[214701]: 2025-11-29 06:49:48.721792507 +0000 UTC m=+0.053883007 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.143 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "0e63e043-06e7-454d-b495-fa69f412a1eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.144 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.162 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.284 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.284 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.290 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.291 186962 INFO nova.compute.claims [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.488 186962 DEBUG nova.compute.provider_tree [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.509 186962 DEBUG nova.scheduler.client.report [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.530 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.531 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.604 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.605 186962 DEBUG nova.network.neutron [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.624 186962 INFO nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.649 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.784 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.785 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.785 186962 INFO nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Creating image(s)#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.786 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.786 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.787 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.801 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.862 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.863 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.864 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.875 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.892 186962 DEBUG nova.network.neutron [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.893 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.929 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:50 np0005539505 nova_compute[186958]: 2025-11-29 06:49:50.930 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.020 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.021 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.022 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.079 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.080 186962 DEBUG nova.virt.disk.api [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Checking if we can resize image /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.081 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.135 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.136 186962 DEBUG nova.virt.disk.api [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Cannot resize image /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.137 186962 DEBUG nova.objects.instance [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'migration_context' on Instance uuid 0e63e043-06e7-454d-b495-fa69f412a1eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.150 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.151 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Ensure instance console log exists: /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.151 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.152 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.152 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.153 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.157 186962 WARNING nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.160 186962 DEBUG nova.virt.libvirt.host [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.161 186962 DEBUG nova.virt.libvirt.host [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.166 186962 DEBUG nova.virt.libvirt.host [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.167 186962 DEBUG nova.virt.libvirt.host [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.168 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.168 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.169 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.169 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.169 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.169 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.169 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.170 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.170 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.170 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.171 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.171 186962 DEBUG nova.virt.hardware [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.174 186962 DEBUG nova.objects.instance [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e63e043-06e7-454d-b495-fa69f412a1eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.195 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <uuid>0e63e043-06e7-454d-b495-fa69f412a1eb</uuid>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <name>instance-0000000b</name>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:name>tempest-LiveMigrationNegativeTest-server-2094499795</nova:name>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:49:51</nova:creationTime>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:user uuid="2d4cdd10f0da450ea4816d37bf63eb69">tempest-LiveMigrationNegativeTest-1866892842-project-member</nova:user>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:        <nova:project uuid="574d8ee971ce4fc39fff37888dddd4e1">tempest-LiveMigrationNegativeTest-1866892842</nova:project>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="serial">0e63e043-06e7-454d-b495-fa69f412a1eb</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="uuid">0e63e043-06e7-454d-b495-fa69f412a1eb</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.config"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/console.log" append="off"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:49:51 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:49:51 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:49:51 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:49:51 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.249 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.249 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.250 186962 INFO nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Using config drive#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.391 186962 INFO nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Creating config drive at /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.config#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.397 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwl2x2mg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.521 186962 DEBUG oslo_concurrency.processutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfwl2x2mg" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:49:51 np0005539505 systemd-machined[153285]: New machine qemu-4-instance-0000000b.
Nov 29 01:49:51 np0005539505 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.705 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:51 np0005539505 nova_compute[186958]: 2025-11-29 06:49:51.823 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.894 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398992.894026, 0e63e043-06e7-454d-b495-fa69f412a1eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.895 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.904 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.905 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.908 186962 INFO nova.virt.libvirt.driver [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance spawned successfully.#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.908 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.935 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.940 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.942 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.943 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.943 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.943 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.944 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.944 186962 DEBUG nova.virt.libvirt.driver [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.986 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.987 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398992.9038796, 0e63e043-06e7-454d-b495-fa69f412a1eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:52 np0005539505 nova_compute[186958]: 2025-11-29 06:49:52.987 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] VM Started (Lifecycle Event)#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.024 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.026 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.072 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.073 186962 INFO nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Took 2.29 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.074 186962 DEBUG nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.171 186962 INFO nova.compute.manager [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Took 2.94 seconds to build instance.#033[00m
Nov 29 01:49:53 np0005539505 nova_compute[186958]: 2025-11-29 06:49:53.207 186962 DEBUG oslo_concurrency.lockutils [None req-6099b61c-3207-4836-b23d-750a11143a3a 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:54 np0005539505 podman[214765]: 2025-11-29 06:49:54.719081386 +0000 UTC m=+0.052656193 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:49:54 np0005539505 podman[214766]: 2025-11-29 06:49:54.770717509 +0000 UTC m=+0.091905897 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.785 186962 DEBUG nova.objects.instance [None req-64a693a5-9978-4b7e-a1e1-3f42862dc484 76da4e2feec94927bdd8b43ab2ed3247 3fff1e8033ef4252a83ca7631204a3b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e63e043-06e7-454d-b495-fa69f412a1eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.806 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764398994.8063753, 0e63e043-06e7-454d-b495-fa69f412a1eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.806 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.827 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.831 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:49:54 np0005539505 nova_compute[186958]: 2025-11-29 06:49:54.856 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 01:49:56 np0005539505 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 29 01:49:56 np0005539505 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 2.592s CPU time.
Nov 29 01:49:56 np0005539505 systemd-machined[153285]: Machine qemu-4-instance-0000000b terminated.
Nov 29 01:49:56 np0005539505 nova_compute[186958]: 2025-11-29 06:49:56.294 186962 DEBUG nova.compute.manager [None req-64a693a5-9978-4b7e-a1e1-3f42862dc484 76da4e2feec94927bdd8b43ab2ed3247 3fff1e8033ef4252a83ca7631204a3b7 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:49:56 np0005539505 nova_compute[186958]: 2025-11-29 06:49:56.705 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:56 np0005539505 nova_compute[186958]: 2025-11-29 06:49:56.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.596 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "0e63e043-06e7-454d-b495-fa69f412a1eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.598 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.599 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "0e63e043-06e7-454d-b495-fa69f412a1eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.599 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.599 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.612 186962 INFO nova.compute.manager [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Terminating instance#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.627 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "refresh_cache-0e63e043-06e7-454d-b495-fa69f412a1eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.627 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquired lock "refresh_cache-0e63e043-06e7-454d-b495-fa69f412a1eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.628 186962 DEBUG nova.network.neutron [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:49:58 np0005539505 podman[214835]: 2025-11-29 06:49:58.729276302 +0000 UTC m=+0.058522628 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 01:49:58 np0005539505 nova_compute[186958]: 2025-11-29 06:49:58.819 186962 DEBUG nova.network.neutron [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.368 186962 DEBUG nova.network.neutron [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.495 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Releasing lock "refresh_cache-0e63e043-06e7-454d-b495-fa69f412a1eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.496 186962 DEBUG nova.compute.manager [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.503 186962 INFO nova.virt.libvirt.driver [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance destroyed successfully.#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.504 186962 DEBUG nova.objects.instance [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'resources' on Instance uuid 0e63e043-06e7-454d-b495-fa69f412a1eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.522 186962 INFO nova.virt.libvirt.driver [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Deleting instance files /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb_del#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.523 186962 INFO nova.virt.libvirt.driver [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Deletion of /var/lib/nova/instances/0e63e043-06e7-454d-b495-fa69f412a1eb_del complete#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.701 186962 INFO nova.compute.manager [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Took 0.21 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.703 186962 DEBUG oslo.service.loopingcall [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.703 186962 DEBUG nova.compute.manager [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.703 186962 DEBUG nova.network.neutron [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:49:59 np0005539505 nova_compute[186958]: 2025-11-29 06:49:59.982 186962 DEBUG nova.network.neutron [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.007 186962 DEBUG nova.network.neutron [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.038 186962 INFO nova.compute.manager [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Took 0.33 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.140 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.141 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.239 186962 DEBUG nova.compute.provider_tree [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.258 186962 DEBUG nova.scheduler.client.report [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.281 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.314 186962 INFO nova.scheduler.client.report [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Deleted allocations for instance 0e63e043-06e7-454d-b495-fa69f412a1eb#033[00m
Nov 29 01:50:00 np0005539505 nova_compute[186958]: 2025-11-29 06:50:00.390 186962 DEBUG oslo_concurrency.lockutils [None req-5863a866-89bf-4925-b5c4-4e267b19a622 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "0e63e043-06e7-454d-b495-fa69f412a1eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.792s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.613 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "7957f954-232f-402f-98b1-f5e740a43946" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.614 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.614 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "7957f954-232f-402f-98b1-f5e740a43946-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.615 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.615 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.771 186962 INFO nova.compute.manager [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Terminating instance#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.789 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "refresh_cache-7957f954-232f-402f-98b1-f5e740a43946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.790 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquired lock "refresh_cache-7957f954-232f-402f-98b1-f5e740a43946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.790 186962 DEBUG nova.network.neutron [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.879 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating tmpfile /var/lib/nova/instances/tmpafq0nrkl to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 01:50:01 np0005539505 nova_compute[186958]: 2025-11-29 06:50:01.881 186962 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.046 186962 DEBUG nova.network.neutron [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.289 186962 DEBUG nova.network.neutron [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.303 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Releasing lock "refresh_cache-7957f954-232f-402f-98b1-f5e740a43946" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.304 186962 DEBUG nova.compute.manager [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:50:02 np0005539505 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 29 01:50:02 np0005539505 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 12.600s CPU time.
Nov 29 01:50:02 np0005539505 systemd-machined[153285]: Machine qemu-3-instance-00000009 terminated.
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.552 186962 INFO nova.virt.libvirt.driver [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance destroyed successfully.#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.553 186962 DEBUG nova.objects.instance [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lazy-loading 'resources' on Instance uuid 7957f954-232f-402f-98b1-f5e740a43946 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.569 186962 INFO nova.virt.libvirt.driver [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Deleting instance files /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946_del#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.570 186962 INFO nova.virt.libvirt.driver [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Deletion of /var/lib/nova/instances/7957f954-232f-402f-98b1-f5e740a43946_del complete#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.642 186962 INFO nova.compute.manager [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.643 186962 DEBUG oslo.service.loopingcall [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.643 186962 DEBUG nova.compute.manager [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.643 186962 DEBUG nova.network.neutron [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.773 186962 DEBUG nova.network.neutron [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.795 186962 DEBUG nova.network.neutron [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.815 186962 INFO nova.compute.manager [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Took 0.17 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.900 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.901 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.960 186962 DEBUG nova.compute.provider_tree [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:02 np0005539505 nova_compute[186958]: 2025-11-29 06:50:02.978 186962 DEBUG nova.scheduler.client.report [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.011 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.047 186962 INFO nova.scheduler.client.report [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Deleted allocations for instance 7957f954-232f-402f-98b1-f5e740a43946#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.097 186962 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.123 186962 DEBUG oslo_concurrency.lockutils [None req-c7389f84-1f66-4128-a953-f827e4c90906 2d4cdd10f0da450ea4816d37bf63eb69 574d8ee971ce4fc39fff37888dddd4e1 - - default default] Lock "7957f954-232f-402f-98b1-f5e740a43946" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.127 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.127 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:03 np0005539505 nova_compute[186958]: 2025-11-29 06:50:03.127 186962 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:03 np0005539505 podman[214866]: 2025-11-29 06:50:03.720143318 +0000 UTC m=+0.057064757 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.306 186962 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.326 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.340 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.341 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating instance directory: /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.341 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Creating disk.info with the contents: {'/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk': 'qcow2', '/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.342 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.342 186962 DEBUG nova.objects.instance [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.367 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.424 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
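The two lines above show that Nova never runs `qemu-img info` bare: the call is wrapped in `oslo_concurrency.prlimit`, which caps the child's address space (1 GiB here) and CPU time (30 s) so a crafted or corrupt qcow2 image cannot make the probe consume unbounded resources. A minimal sketch of how that wrapper argv is assembled — `wrap_with_prlimit` is a hypothetical helper, not Nova's actual function, but the resulting command matches the one logged:

```python
def wrap_with_prlimit(cmd, as_bytes=1073741824, cpu_secs=30):
    """Prefix a command with oslo.concurrency's prlimit shim, as seen in
    the log above: the shim applies RLIMIT_AS and RLIMIT_CPU to the child
    before exec'ing it, so `qemu-img info` on an untrusted image is bounded."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={as_bytes}", f"--cpu={cpu_secs}", "--",
    ] + list(cmd)

# Reconstructs the exact command logged at 06:50:04.367.
wrapped = wrap_with_prlimit([
    "env", "LC_ALL=C", "LANG=C", "qemu-img", "info",
    "/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28",
    "--force-share", "--output=json",
])
```

`--force-share` lets the probe open an image that a running QEMU may hold locked, which matters during live migration when the source still owns the disk.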
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.425 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.426 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.436 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.496 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.497 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.972 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk 1073741824" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
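The `qemu-img create` just logged builds the instance's root disk as a copy-on-write qcow2 overlay whose backing file is the image cached under `_base`: the overlay starts nearly empty, and only blocks the guest changes are stored in it, which is why pre_live_migration on the destination is cheap. A sketch of building that argv — `build_qcow2_overlay_cmd` is a hypothetical helper; the arguments mirror the logged command:

```python
def build_qcow2_overlay_cmd(base, overlay, size_bytes):
    """Build the qemu-img invocation logged above: a qcow2 overlay backed
    by the cached raw base image. backing_fmt=raw pins the backing format
    so qemu-img never has to probe (probing untrusted files is unsafe)."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base},backing_fmt=raw",
        overlay, str(size_bytes),
    ]

cmd = build_qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28",
    "/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk",
    1073741824,
)
```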
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.973 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:04 np0005539505 nova_compute[186958]: 2025-11-29 06:50:04.974 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.030 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.031 186962 DEBUG nova.virt.disk.api [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Checking if we can resize image /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.031 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.100 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.101 186962 DEBUG nova.virt.disk.api [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Cannot resize image /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.102 186962 DEBUG nova.objects.instance [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lazy-loading 'migration_context' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.126 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.172 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config 485376" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.173 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config to /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.174 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.689 186962 DEBUG oslo_concurrency.processutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef/disk.config /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.691 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.693 186962 DEBUG nova.virt.libvirt.vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:49:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:49:59Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.693 186962 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.695 186962 DEBUG nova.network.os_vif_util [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.696 186962 DEBUG os_vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.697 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.698 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.698 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.702 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.703 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04956313-39, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.704 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04956313-39, col_values=(('external_ids', {'iface-id': '04956313-39e4-4275-ab3f-18aa7a1a0e46', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:f5:6d', 'vm-uuid': 'e6b5b54b-9532-4f51-a346-42dee946a9ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
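The `DbSetCommand` above is the key handoff between Nova and OVN: os-vif writes `external_ids` onto the new OVS Interface row, and ovn-controller matches `iface-id` against its logical switch port names to bind the port to this chassis (visible below at 06:50:08 as "Claiming lport ..."). A sketch of that mapping, with values taken directly from the logged transaction — `build_external_ids` is a hypothetical helper:

```python
def build_external_ids(port_id, mac, instance_uuid):
    """external_ids keys written on the OVS Interface record, per the
    DbSetCommand logged above. ovn-controller keys on `iface-id` (the
    Neutron port UUID) to claim the logical port for this chassis."""
    return {
        "iface-id": port_id,
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": instance_uuid,
    }

ids = build_external_ids(
    "04956313-39e4-4275-ab3f-18aa7a1a0e46",
    "fa:16:3e:b8:f5:6d",
    "e6b5b54b-9532-4f51-a346-42dee946a9ef",
)
```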
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.706 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:05 np0005539505 NetworkManager[55134]: <info>  [1764399005.7083] manager: (tap04956313-39): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.712 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.716 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.717 186962 INFO os_vif [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39')#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.718 186962 DEBUG nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 29 01:50:05 np0005539505 nova_compute[186958]: 2025-11-29 06:50:05.719 186962 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 29 01:50:06 np0005539505 nova_compute[186958]: 2025-11-29 06:50:06.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:07 np0005539505 nova_compute[186958]: 2025-11-29 06:50:07.703 186962 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Port 04956313-39e4-4275-ab3f-18aa7a1a0e46 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 01:50:07 np0005539505 nova_compute[186958]: 2025-11-29 06:50:07.715 186962 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpafq0nrkl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e6b5b54b-9532-4f51-a346-42dee946a9ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
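When tracing a migration through logs like these, the useful keys are the timestamp, worker pid, level, logger, and the `req-...` request id that ties every line of one operation together. A sketch of pulling those fields out of the oslo.log body (the part after the journald prefix), assuming the default oslo context format; the sample line is abbreviated with `...`:

```python
import re

# Matches the default oslo.log line shape used by nova_compute above:
# "<timestamp> <pid> <LEVEL> <logger> [<request context>] <message>"
OSLO_RE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"(?P<pid>\d+) (?P<level>[A-Z]+) (?P<logger>\S+) "
    r"\[(?P<ctx>[^\]]*)\]"
)

line = ("2025-11-29 06:50:07.715 186962 DEBUG nova.compute.manager "
        "[None req-3252067d-1834-4013-a205-9395c5ff1383 ... - - default default] "
        "pre_live_migration result data is ...")
m = OSLO_RE.match(line)
```

Grepping a whole journal for one `req-...` id reconstructs the end-to-end timeline of a single API operation across workers.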
Nov 29 01:50:08 np0005539505 kernel: tap04956313-39: entered promiscuous mode
Nov 29 01:50:08 np0005539505 NetworkManager[55134]: <info>  [1764399008.0180] manager: (tap04956313-39): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 29 01:50:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:08Z|00046|binding|INFO|Claiming lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 for this additional chassis.
Nov 29 01:50:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:08Z|00047|binding|INFO|04956313-39e4-4275-ab3f-18aa7a1a0e46: Claiming fa:16:3e:b8:f5:6d 10.100.0.9
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.090 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:08Z|00048|binding|INFO|Claiming lport 81278169-001b-4894-adbd-075edcc27e49 for this additional chassis.
Nov 29 01:50:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:08Z|00049|binding|INFO|81278169-001b-4894-adbd-075edcc27e49: Claiming fa:16:3e:a3:12:0f 19.80.0.181
Nov 29 01:50:08 np0005539505 systemd-udevd[214921]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.097 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:08 np0005539505 NetworkManager[55134]: <info>  [1764399008.1032] device (tap04956313-39): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:50:08 np0005539505 NetworkManager[55134]: <info>  [1764399008.1067] device (tap04956313-39): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:50:08 np0005539505 systemd-machined[153285]: New machine qemu-5-instance-0000000a.
Nov 29 01:50:08 np0005539505 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Nov 29 01:50:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:08Z|00050|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 ovn-installed in OVS
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.179 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.430 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.430 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.459 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.563 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.563 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.569 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.570 186962 INFO nova.compute.claims [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.718 186962 DEBUG nova.compute.provider_tree [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.733 186962 DEBUG nova.scheduler.client.report [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
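The inventory dict logged above is what Placement capacity checks run against: for each resource class, the schedulable amount is `(total - reserved) * allocation_ratio`. A quick sketch using the values from this provider, so the effective headroom (e.g. 32 schedulable VCPUs from 8 physical with a 4.0 ratio) is explicit; `capacity` is a hypothetical helper:

```python
def capacity(inv):
    """Effective schedulable capacity per resource class, as Placement
    computes it: (total - reserved) * allocation_ratio."""
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

# Values from the inventory logged for provider
# 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 above.
inventory = {
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
}
caps = capacity(inventory)
```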
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.752 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.753 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.818 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.819 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.848 186962 INFO nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.874 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:50:08 np0005539505 nova_compute[186958]: 2025-11-29 06:50:08.998 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.000 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.001 186962 INFO nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Creating image(s)#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.001 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.001 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.002 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.020 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.074 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.075 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.076 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.088 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.143 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.144 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.308 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399009.308244, e6b5b54b-9532-4f51-a346-42dee946a9ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.309 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Started (Lifecycle Event)#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.330 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.432 186962 DEBUG nova.policy [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.805 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk 1073741824" returned: 0 in 0.660s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.806 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.807 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.908 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.909 186962 DEBUG nova.virt.disk.api [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Checking if we can resize image /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.909 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.974 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.976 186962 DEBUG nova.virt.disk.api [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Cannot resize image /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:50:09 np0005539505 nova_compute[186958]: 2025-11-29 06:50:09.976 186962 DEBUG nova.objects.instance [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'migration_context' on Instance uuid 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.078 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.079 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Ensure instance console log exists: /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.080 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.080 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.081 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.282 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399010.2814233, e6b5b54b-9532-4f51-a346-42dee946a9ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.283 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.640 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.644 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.690 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.708 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:10 np0005539505 nova_compute[186958]: 2025-11-29 06:50:10.854 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Successfully created port: 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:50:11 np0005539505 nova_compute[186958]: 2025-11-29 06:50:11.296 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764398996.2947228, 0e63e043-06e7-454d-b495-fa69f412a1eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:11 np0005539505 nova_compute[186958]: 2025-11-29 06:50:11.297 186962 INFO nova.compute.manager [-] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:50:11 np0005539505 nova_compute[186958]: 2025-11-29 06:50:11.315 186962 DEBUG nova.compute.manager [None req-14d408cd-748f-472d-81ad-dac6dd46d6f3 - - - - - -] [instance: 0e63e043-06e7-454d-b495-fa69f412a1eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:11 np0005539505 nova_compute[186958]: 2025-11-29 06:50:11.711 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00051|binding|INFO|Claiming lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 for this chassis.
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00052|binding|INFO|04956313-39e4-4275-ab3f-18aa7a1a0e46: Claiming fa:16:3e:b8:f5:6d 10.100.0.9
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00053|binding|INFO|Claiming lport 81278169-001b-4894-adbd-075edcc27e49 for this chassis.
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00054|binding|INFO|81278169-001b-4894-adbd-075edcc27e49: Claiming fa:16:3e:a3:12:0f 19.80.0.181
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00055|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 up in Southbound
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00056|binding|INFO|Setting lport 81278169-001b-4894-adbd-075edcc27e49 up in Southbound
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.487 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:12:0f 19.80.0.181'], port_security=['fa:16:3e:a3:12:0f 19.80.0.181'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['04956313-39e4-4275-ab3f-18aa7a1a0e46'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1726918358', 'neutron:cidrs': '19.80.0.181/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1726918358', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ec2078f4-7ef2-4848-8fcd-c69eaba744f4, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81278169-001b-4894-adbd-075edcc27e49) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.491 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f5:6d 10.100.0.9'], port_security=['fa:16:3e:b8:f5:6d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1473147608', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e6b5b54b-9532-4f51-a346-42dee946a9ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1473147608', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=04956313-39e4-4275-ab3f-18aa7a1a0e46) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.493 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 81278169-001b-4894-adbd-075edcc27e49 in datapath 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 bound to our chassis#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.497 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.511 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f0940df6-4234-41c4-a10c-85790980c2ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.513 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bda1138-f1 in ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.515 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bda1138-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.515 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb8172e-290a-401a-89e6-4600f1a73ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.516 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe0b52a-0656-453e-84b4-87a3320f7c6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.527 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[5799d32e-9447-436b-8d91-acdc47ef8422]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.541 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d5269aae-fed6-435c-ac9f-b483d263b89b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.575 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2efcfd-d7c7-44a9-8acf-e5522f582317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.581 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2a57e832-b023-4787-8eaf-df15754548c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 NetworkManager[55134]: <info>  [1764399012.5829] manager: (tap5bda1138-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Nov 29 01:50:12 np0005539505 systemd-udevd[214967]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.620 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2be2c0-c7db-4a0d-a8b0-31e7de0c6a0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.624 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b5aa4b3a-2b08-4eca-baea-93fc0610781b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 NetworkManager[55134]: <info>  [1764399012.6609] device (tap5bda1138-f0): carrier: link connected
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.665 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d74fd35-f104-4ad4-be09-35a44f158bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.688 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7e57a1ac-5350-4ada-8ae8-a08d3dc9ced2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda1138-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:70:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446024, 'reachable_time': 27075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214986, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.711 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[331beca5-aa5e-4d19-a471-38d94f825e38]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:705d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446024, 'tstamp': 446024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214987, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8fab1f0c-d84c-43c0-80ed-f274ac0c375b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda1138-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:70:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446024, 'reachable_time': 27075, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214988, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 nova_compute[186958]: 2025-11-29 06:50:12.756 186962 INFO nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Post operation of migration started#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.772 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b57ccfd3-6c15-4c88-b7e2-2875d82af8e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.840 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04a8905e-18df-427c-8d32-0d9fc1102a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.842 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda1138-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.843 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.843 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bda1138-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:12 np0005539505 nova_compute[186958]: 2025-11-29 06:50:12.846 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:12 np0005539505 kernel: tap5bda1138-f0: entered promiscuous mode
Nov 29 01:50:12 np0005539505 NetworkManager[55134]: <info>  [1764399012.8477] manager: (tap5bda1138-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 29 01:50:12 np0005539505 nova_compute[186958]: 2025-11-29 06:50:12.849 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.850 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bda1138-f0, col_values=(('external_ids', {'iface-id': 'a5f83360-af8d-41aa-987f-5ef9d63c1561'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:12Z|00057|binding|INFO|Releasing lport a5f83360-af8d-41aa-987f-5ef9d63c1561 from this chassis (sb_readonly=0)
Nov 29 01:50:12 np0005539505 nova_compute[186958]: 2025-11-29 06:50:12.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.856 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.858 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2555f277-777a-40b7-b332-55608bbac143]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.859 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.pid.haproxy
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:50:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:12.861 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'env', 'PROCESS_TAG=haproxy-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bda1138-fab5-4b3a-9a12-4d1c90a4dce0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:50:12 np0005539505 nova_compute[186958]: 2025-11-29 06:50:12.863 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.117 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Successfully updated port: 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.130 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.130 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquired lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.131 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:13 np0005539505 podman[215026]: 2025-11-29 06:50:13.174782554 +0000 UTC m=+0.021793714 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.336 186962 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-changed-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.336 186962 DEBUG nova.compute.manager [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Refreshing instance network info cache due to event network-changed-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:50:13 np0005539505 nova_compute[186958]: 2025-11-29 06:50:13.337 186962 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:14 np0005539505 nova_compute[186958]: 2025-11-29 06:50:14.116 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:14 np0005539505 podman[215026]: 2025-11-29 06:50:14.995095063 +0000 UTC m=+1.842106233 container create 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:50:15 np0005539505 nova_compute[186958]: 2025-11-29 06:50:15.104 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:15 np0005539505 nova_compute[186958]: 2025-11-29 06:50:15.105 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquired lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:15 np0005539505 nova_compute[186958]: 2025-11-29 06:50:15.105 186962 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:15 np0005539505 systemd[1]: Started libpod-conmon-0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e.scope.
Nov 29 01:50:15 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:50:15 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c67f2ee1b82c38b4babf50ce4b806d1286a0fb60c8e2962177e42c14e92e5d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:50:15 np0005539505 podman[215026]: 2025-11-29 06:50:15.677656442 +0000 UTC m=+2.524667632 container init 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:50:15 np0005539505 podman[215026]: 2025-11-29 06:50:15.682768066 +0000 UTC m=+2.529779226 container start 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:15 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [NOTICE]   (215071) : New worker (215077) forked
Nov 29 01:50:15 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [NOTICE]   (215071) : Loading success.
Nov 29 01:50:15 np0005539505 nova_compute[186958]: 2025-11-29 06:50:15.711 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:15 np0005539505 podman[215049]: 2025-11-29 06:50:15.84497648 +0000 UTC m=+0.214970400 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.855 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 04956313-39e4-4275-ab3f-18aa7a1a0e46 in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.857 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950#033[00m
Nov 29 01:50:15 np0005539505 podman[215048]: 2025-11-29 06:50:15.857312097 +0000 UTC m=+0.226995939 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.869 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3e62dfbd-bd5f-4d1c-a54f-94ee966071ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.870 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24ee44f0-21 in ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.872 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24ee44f0-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.872 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5f09ad91-de1f-4cf1-91a8-ce6fd0bcfd06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.873 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1da3ffe0-bf7f-40e1-a804-1ad487972ee9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.884 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d837657b-4e62-4318-afd6-63a55b373de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.908 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8e247628-c60e-457c-a3f5-1edc3709cecc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.936 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[02dc6f7d-e158-4519-be41-ed31ae41cba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.947 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[687bc6e4-a7c7-4668-9e42-f07c90c01eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 NetworkManager[55134]: <info>  [1764399015.9479] manager: (tap24ee44f0-20): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 29 01:50:15 np0005539505 systemd-udevd[215113]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.982 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f42ea1d6-0d23-4ef4-a6a0-e6b615f81294]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:15.985 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c3cc63-ba56-4ca0-b398-09a026f94c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 NetworkManager[55134]: <info>  [1764399016.0092] device (tap24ee44f0-20): carrier: link connected
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.013 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[039ceee2-a5ca-47d9-838c-fcb666b63fed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.029 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9449b24-0f1b-4898-b6d1-eaf47f8f74b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446359, 'reachable_time': 26177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215133, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.044 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a437909b-bcaf-463f-9642-17905deb37e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea4:940c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446359, 'tstamp': 446359}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215134, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.056 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd0241c-bd46-4fa8-b414-ebf7beb8a894]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24ee44f0-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a4:94:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446359, 'reachable_time': 26177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215135, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.078 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66d33d7e-f358-4b9a-ba7e-a2da4058035a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.134 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d7ab73f-eb9b-43cd-b96c-72f6e35c6c8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.135 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.136 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.136 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24ee44f0-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.139 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:16 np0005539505 NetworkManager[55134]: <info>  [1764399016.1396] manager: (tap24ee44f0-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 29 01:50:16 np0005539505 kernel: tap24ee44f0-20: entered promiscuous mode
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.141 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.144 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24ee44f0-20, col_values=(('external_ids', {'iface-id': 'ffbd3b8f-7e45-45d4-84ce-cd74c712f992'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.146 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:16 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:16Z|00058|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.146 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.148 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.149 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8ccf1d-5717-465e-a7f7-5855769d41a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.150 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.pid.haproxy
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 24ee44f0-2b10-459c-aabf-bf9ef2c8d950
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:50:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:16.151 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'env', 'PROCESS_TAG=haproxy-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24ee44f0-2b10-459c-aabf-bf9ef2c8d950.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.158 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:16 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:16Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b8:f5:6d 10.100.0.9
Nov 29 01:50:16 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:16Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b8:f5:6d 10.100.0.9
Nov 29 01:50:16 np0005539505 podman[215168]: 2025-11-29 06:50:16.513623047 +0000 UTC m=+0.054157315 container create c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:50:16 np0005539505 systemd[1]: Started libpod-conmon-c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5.scope.
Nov 29 01:50:16 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:50:16 np0005539505 podman[215168]: 2025-11-29 06:50:16.483952172 +0000 UTC m=+0.024486470 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:50:16 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f31a92e9378c42df3acf1395f99aabda98ac209abb752aefcbab2bfec28d32a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:50:16 np0005539505 podman[215168]: 2025-11-29 06:50:16.59259795 +0000 UTC m=+0.133132228 container init c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:16 np0005539505 podman[215168]: 2025-11-29 06:50:16.60006629 +0000 UTC m=+0.140600578 container start c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:50:16 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [NOTICE]   (215187) : New worker (215189) forked
Nov 29 01:50:16 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [NOTICE]   (215187) : Loading success.
Nov 29 01:50:16 np0005539505 nova_compute[186958]: 2025-11-29 06:50:16.713 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.165 186962 DEBUG nova.network.neutron [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updating instance_info_cache with network_info: [{"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.224 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Releasing lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.225 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Instance network_info: |[{"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.226 186962 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.226 186962 DEBUG nova.network.neutron [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Refreshing network info cache for port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.230 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Start _get_guest_xml network_info=[{"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.238 186962 WARNING nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.244 186962 DEBUG nova.virt.libvirt.host [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.244 186962 DEBUG nova.virt.libvirt.host [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.247 186962 DEBUG nova.virt.libvirt.host [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.247 186962 DEBUG nova.virt.libvirt.host [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.248 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.249 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:50:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1523851324',id=21,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-1649149277',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.249 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.249 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.249 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.250 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.250 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.250 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.250 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.251 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.251 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.251 186962 DEBUG nova.virt.hardware [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.254 186962 DEBUG nova.virt.libvirt.vif [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:50:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(21),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-250603763',id=12,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=21,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-35t5cudy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=9dcd4651-f485-4d8f-b6b2-02492d2c0a1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.254 186962 DEBUG nova.network.os_vif_util [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.255 186962 DEBUG nova.network.os_vif_util [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.256 186962 DEBUG nova.objects.instance [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'pci_devices' on Instance uuid 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.287 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <uuid>9dcd4651-f485-4d8f-b6b2-02492d2c0a1c</uuid>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <name>instance-0000000c</name>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-250603763</nova:name>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:50:17</nova:creationTime>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:flavor name="tempest-flavor_with_ephemeral_0-1649149277">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:user uuid="0908eb33a338434891ed9f5dd3768bab">tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member</nova:user>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:project uuid="99863a77c63a4673b0ef23a7c0ae373a">tempest-ServersWithSpecificFlavorTestJSON-706098489</nova:project>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        <nova:port uuid="82af0ec7-8b7a-4fe2-b069-4a1c3566e90d">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="serial">9dcd4651-f485-4d8f-b6b2-02492d2c0a1c</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="uuid">9dcd4651-f485-4d8f-b6b2-02492d2c0a1c</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.config"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:29:d8:40"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <target dev="tap82af0ec7-8b"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/console.log" append="off"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:50:17 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:50:17 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:50:17 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:50:17 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.288 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Preparing to wait for external event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.288 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.288 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.289 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.289 186962 DEBUG nova.virt.libvirt.vif [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:50:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(21),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-250603763',id=12,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=21,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-35t5cudy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:50:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=9dcd4651-f485-4d8f-b6b2-02492d2c0a1c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.289 186962 DEBUG nova.network.os_vif_util [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.290 186962 DEBUG nova.network.os_vif_util [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.290 186962 DEBUG os_vif [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.291 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.291 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.291 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.294 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.294 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82af0ec7-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.294 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82af0ec7-8b, col_values=(('external_ids', {'iface-id': '82af0ec7-8b7a-4fe2-b069-4a1c3566e90d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:d8:40', 'vm-uuid': '9dcd4651-f485-4d8f-b6b2-02492d2c0a1c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:17 np0005539505 NetworkManager[55134]: <info>  [1764399017.3321] manager: (tap82af0ec7-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.334 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.337 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.338 186962 INFO os_vif [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b')
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.493 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.493 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.493 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No VIF found with MAC fa:16:3e:29:d8:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.494 186962 INFO nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Using config drive
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.553 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399002.5494347, 7957f954-232f-402f-98b1-f5e740a43946 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.553 186962 INFO nova.compute.manager [-] [instance: 7957f954-232f-402f-98b1-f5e740a43946] VM Stopped (Lifecycle Event)
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.576 186962 DEBUG nova.compute.manager [None req-2764d8f3-f692-4a4b-a259-27b4d5531810 - - - - - -] [instance: 7957f954-232f-402f-98b1-f5e740a43946] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.985 186962 INFO nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Creating config drive at /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.config
Nov 29 01:50:17 np0005539505 nova_compute[186958]: 2025-11-29 06:50:17.993 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6akwi_9g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.077 186962 DEBUG nova.network.neutron [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [{"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.104 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Releasing lock "refresh_cache-e6b5b54b-9532-4f51-a346-42dee946a9ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.133 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.134 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.134 186962 DEBUG oslo_concurrency.lockutils [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.135 186962 DEBUG oslo_concurrency.processutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6akwi_9g" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.140 186962 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:50:18 np0005539505 virtqemud[186353]: Domain id=5 name='instance-0000000a' uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef is tainted: custom-monitor
Nov 29 01:50:18 np0005539505 kernel: tap82af0ec7-8b: entered promiscuous mode
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.1829] manager: (tap82af0ec7-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.183 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:18Z|00059|binding|INFO|Claiming lport 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d for this chassis.
Nov 29 01:50:18 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:18Z|00060|binding|INFO|82af0ec7-8b7a-4fe2-b069-4a1c3566e90d: Claiming fa:16:3e:29:d8:40 10.100.0.12
Nov 29 01:50:18 np0005539505 systemd-udevd[215129]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.188 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.1981] device (tap82af0ec7-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.1989] device (tap82af0ec7-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:50:18 np0005539505 systemd-machined[153285]: New machine qemu-6-instance-0000000c.
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.227 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:d8:40 10.100.0.12'], port_security=['fa:16:3e:29:d8:40 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9dcd4651-f485-4d8f-b6b2-02492d2c0a1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61990940-649c-4332-bea9-4159087142dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a14e098-439d-46d7-8c4c-c5f31ab6085d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e9d8ee4-839b-4078-a12e-f22cafda935b, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.228 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d in datapath 61990940-649c-4332-bea9-4159087142dd bound to our chassis#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.230 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61990940-649c-4332-bea9-4159087142dd#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.240 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1080b004-376b-4423-bd39-87f4ab139c53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.240 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61990940-61 in ovnmeta-61990940-649c-4332-bea9-4159087142dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.242 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61990940-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:50:18 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:18Z|00061|binding|INFO|Setting lport 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d ovn-installed in OVS
Nov 29 01:50:18 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:18Z|00062|binding|INFO|Setting lport 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d up in Southbound
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.243 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4de374da-0263-4fab-b828-08659029de6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 systemd[1]: Started Virtual Machine qemu-6-instance-0000000c.
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.244 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.243 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3e643a-aaf9-49cd-8aab-fd6ff76e1b39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.255 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[35161966-b98b-4af5-8d5d-6bd5c5a3de14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.268 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3137b6d3-f5d9-48a3-a07d-cfe958f09029]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.296 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[56237548-67db-4d5b-9136-249b07bc4907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.3044] manager: (tap61990940-60): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.303 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1fae5fc9-cb87-4bcc-86f4-d604c0e8cc20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.331 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c7438fed-b849-4a7c-8dd1-be90815844e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.333 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[59d91d36-28a5-40a5-b078-60a8a9ebfa56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.3537] device (tap61990940-60): carrier: link connected
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.359 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4010a4d7-6aa8-41f0-8595-674ca7d586a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.376 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d2299ed0-a2f6-400a-ac21-cf518250b4c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61990940-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:ea:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446593, 'reachable_time': 15906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215235, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.391 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d99c45f6-3612-4509-bb62-d2ac8c6ccfaf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:ea5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446593, 'tstamp': 446593}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215236, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.408 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4dfa5c7d-9880-4b02-b640-d4abba2db153]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61990940-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:ea:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446593, 'reachable_time': 15906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215237, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.438 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[73cc42b9-3aa9-4af1-8050-91e9f79d57a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.488 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[44e70320-e6cb-4dc3-90c1-0e0300a0eecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.489 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61990940-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.490 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.490 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61990940-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.525 186962 DEBUG nova.compute.manager [req-2fff9851-3128-4abd-8a9f-bc593d6bec73 req-03265e5b-23c0-46f7-9125-ad6d4ff71beb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.526 186962 DEBUG oslo_concurrency.lockutils [req-2fff9851-3128-4abd-8a9f-bc593d6bec73 req-03265e5b-23c0-46f7-9125-ad6d4ff71beb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.527 186962 DEBUG oslo_concurrency.lockutils [req-2fff9851-3128-4abd-8a9f-bc593d6bec73 req-03265e5b-23c0-46f7-9125-ad6d4ff71beb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.527 186962 DEBUG oslo_concurrency.lockutils [req-2fff9851-3128-4abd-8a9f-bc593d6bec73 req-03265e5b-23c0-46f7-9125-ad6d4ff71beb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.527 186962 DEBUG nova.compute.manager [req-2fff9851-3128-4abd-8a9f-bc593d6bec73 req-03265e5b-23c0-46f7-9125-ad6d4ff71beb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Processing event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:50:18 np0005539505 NetworkManager[55134]: <info>  [1764399018.5376] manager: (tap61990940-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Nov 29 01:50:18 np0005539505 kernel: tap61990940-60: entered promiscuous mode
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.549 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61990940-60, col_values=(('external_ids', {'iface-id': '04e49043-e1b1-4b06-a437-d8d097a15b16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:18 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:18Z|00063|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.550 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.576 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.578 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.580 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82a13d6d-9178-4511-991a-76fcdbf92270]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.581 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-61990940-649c-4332-bea9-4159087142dd
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 61990940-649c-4332-bea9-4159087142dd
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:50:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:18.581 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'env', 'PROCESS_TAG=haproxy-61990940-649c-4332-bea9-4159087142dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61990940-649c-4332-bea9-4159087142dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.731 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399018.731031, 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.732 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] VM Started (Lifecycle Event)#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.733 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.737 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.740 186962 INFO nova.virt.libvirt.driver [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Instance spawned successfully.#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.740 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.758 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.761 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.769 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.769 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.769 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.770 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.770 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.771 186962 DEBUG nova.virt.libvirt.driver [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.831 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.832 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399018.731265, 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.832 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.867 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.871 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399018.736375, 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.872 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.901 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.905 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.944 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:50:18 np0005539505 podman[215274]: 2025-11-29 06:50:18.971994822 +0000 UTC m=+0.054737182 container create f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.991 186962 INFO nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Took 9.99 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:50:18 np0005539505 nova_compute[186958]: 2025-11-29 06:50:18.993 186962 DEBUG nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:19 np0005539505 systemd[1]: Started libpod-conmon-f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b.scope.
Nov 29 01:50:19 np0005539505 podman[215274]: 2025-11-29 06:50:18.943065428 +0000 UTC m=+0.025807828 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:50:19 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:50:19 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c564f740949d9b90280ce489920b41e3194481b41de4d1ce42c60b647b29697f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:50:19 np0005539505 podman[215274]: 2025-11-29 06:50:19.062582751 +0000 UTC m=+0.145325151 container init f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:19 np0005539505 podman[215274]: 2025-11-29 06:50:19.074298991 +0000 UTC m=+0.157041371 container start f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:50:19 np0005539505 podman[215287]: 2025-11-29 06:50:19.074370453 +0000 UTC m=+0.063509179 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 01:50:19 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [NOTICE]   (215312) : New worker (215314) forked
Nov 29 01:50:19 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [NOTICE]   (215312) : Loading success.
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.151 186962 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.310 186962 INFO nova.compute.manager [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Took 10.78 seconds to build instance.#033[00m
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.369 186962 DEBUG oslo_concurrency.lockutils [None req-88dd3852-b86f-4272-b9cd-96ed6a98313b 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.388 186962 DEBUG nova.network.neutron [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updated VIF entry in instance network info cache for port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.389 186962 DEBUG nova.network.neutron [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updating instance_info_cache with network_info: [{"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:19 np0005539505 nova_compute[186958]: 2025-11-29 06:50:19.408 186962 DEBUG oslo_concurrency.lockutils [req-787b27f2-a1a8-4297-a797-726470a90995 req-9c5f19ec-b6c6-4d01-8449-26b93f2f5ce4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.157 186962 INFO nova.virt.libvirt.driver [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.162 186962 DEBUG nova.compute.manager [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.187 186962 DEBUG nova.objects.instance [None req-3252067d-1834-4013-a205-9395c5ff1383 1e2bab636b134574964720b43ad00142 23f066c2a3d74f248a3ec36248d78753 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.800 186962 DEBUG nova.compute.manager [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.801 186962 DEBUG oslo_concurrency.lockutils [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.801 186962 DEBUG oslo_concurrency.lockutils [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.801 186962 DEBUG oslo_concurrency.lockutils [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.802 186962 DEBUG nova.compute.manager [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] No waiting events found dispatching network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:20 np0005539505 nova_compute[186958]: 2025-11-29 06:50:20.802 186962 WARNING nova.compute.manager [req-ec05bf7a-c2a4-465d-9c85-175ba17cd610 req-56076a43-5dd8-4e41-91a1-503bb6946498 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received unexpected event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d for instance with vm_state active and task_state None.#033[00m
Nov 29 01:50:21 np0005539505 nova_compute[186958]: 2025-11-29 06:50:21.716 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:22 np0005539505 nova_compute[186958]: 2025-11-29 06:50:22.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2833] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/39)
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2842] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2855] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/40)
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2859] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2871] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2879] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2885] device (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:50:22 np0005539505 NetworkManager[55134]: <info>  [1764399022.2889] device (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 29 01:50:22 np0005539505 nova_compute[186958]: 2025-11-29 06:50:22.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:22 np0005539505 nova_compute[186958]: 2025-11-29 06:50:22.384 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:22Z|00064|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:50:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:22Z|00065|binding|INFO|Releasing lport a5f83360-af8d-41aa-987f-5ef9d63c1561 from this chassis (sb_readonly=0)
Nov 29 01:50:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:22Z|00066|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:22 np0005539505 nova_compute[186958]: 2025-11-29 06:50:22.403 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:23 np0005539505 nova_compute[186958]: 2025-11-29 06:50:23.847 186962 DEBUG nova.compute.manager [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-changed-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:23 np0005539505 nova_compute[186958]: 2025-11-29 06:50:23.848 186962 DEBUG nova.compute.manager [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Refreshing instance network info cache due to event network-changed-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 01:50:23 np0005539505 nova_compute[186958]: 2025-11-29 06:50:23.848 186962 DEBUG oslo_concurrency.lockutils [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 01:50:23 np0005539505 nova_compute[186958]: 2025-11-29 06:50:23.848 186962 DEBUG oslo_concurrency.lockutils [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 01:50:23 np0005539505 nova_compute[186958]: 2025-11-29 06:50:23.849 186962 DEBUG nova.network.neutron [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Refreshing network info cache for port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.429 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.429 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.430 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.430 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.431 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.444 186962 INFO nova.compute.manager [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Terminating instance
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.456 186962 DEBUG nova.compute.manager [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 01:50:24 np0005539505 kernel: tap04956313-39 (unregistering): left promiscuous mode
Nov 29 01:50:24 np0005539505 NetworkManager[55134]: <info>  [1764399024.4783] device (tap04956313-39): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00067|binding|INFO|Releasing lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 from this chassis (sb_readonly=0)
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.487 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00068|binding|INFO|Setting lport 04956313-39e4-4275-ab3f-18aa7a1a0e46 down in Southbound
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00069|binding|INFO|Releasing lport 81278169-001b-4894-adbd-075edcc27e49 from this chassis (sb_readonly=0)
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00070|binding|INFO|Setting lport 81278169-001b-4894-adbd-075edcc27e49 down in Southbound
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00071|binding|INFO|Removing iface tap04956313-39 ovn-installed in OVS
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:24 np0005539505 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 29 01:50:24 np0005539505 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 3.759s CPU time.
Nov 29 01:50:24 np0005539505 systemd-machined[153285]: Machine qemu-5-instance-0000000a terminated.
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.572 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:12:0f 19.80.0.181'], port_security=['fa:16:3e:a3:12:0f 19.80.0.181'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['04956313-39e4-4275-ab3f-18aa7a1a0e46'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1726918358', 'neutron:cidrs': '19.80.0.181/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1726918358', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=ec2078f4-7ef2-4848-8fcd-c69eaba744f4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=81278169-001b-4894-adbd-075edcc27e49) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00072|binding|INFO|Releasing lport ffbd3b8f-7e45-45d4-84ce-cd74c712f992 from this chassis (sb_readonly=0)
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00073|binding|INFO|Releasing lport a5f83360-af8d-41aa-987f-5ef9d63c1561 from this chassis (sb_readonly=0)
Nov 29 01:50:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:24Z|00074|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.575 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:f5:6d 10.100.0.9'], port_security=['fa:16:3e:b8:f5:6d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1473147608', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e6b5b54b-9532-4f51-a346-42dee946a9ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1473147608', 'neutron:project_id': '2b6eb92d93c24eaaa0c6a3104a54633a', 'neutron:revision_number': '11', 'neutron:security_group_ids': '1b137676-29a0-4a8e-83e8-cda39edaccb8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fafd611f-c010-460d-b1cc-2d52a79696f1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=04956313-39e4-4275-ab3f-18aa7a1a0e46) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.577 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 81278169-001b-4894-adbd-075edcc27e49 in datapath 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 unbound from our chassis
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.579 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.581 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04dc2b08-7f50-4590-b7d3-b4fbca5ba090]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:24.594 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 namespace which is not needed anymore
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.996 186962 INFO nova.virt.libvirt.driver [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Instance destroyed successfully.
Nov 29 01:50:24 np0005539505 nova_compute[186958]: 2025-11-29 06:50:24.996 186962 DEBUG nova.objects.instance [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lazy-loading 'resources' on Instance uuid e6b5b54b-9532-4f51-a346-42dee946a9ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [NOTICE]   (215071) : haproxy version is 2.8.14-c23fe91
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [NOTICE]   (215071) : path to executable is /usr/sbin/haproxy
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [ALERT]    (215071) : Current worker (215077) exited with code 143 (Terminated)
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0[215046]: [WARNING]  (215071) : All workers exited. Exiting... (0)
Nov 29 01:50:25 np0005539505 systemd[1]: libpod-0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e.scope: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215361]: 2025-11-29 06:50:25.034445093 +0000 UTC m=+0.050357138 container died 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:50:25 np0005539505 podman[215357]: 2025-11-29 06:50:25.075453857 +0000 UTC m=+0.078755987 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:50:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e-userdata-shm.mount: Deactivated successfully.
Nov 29 01:50:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay-4c67f2ee1b82c38b4babf50ce4b806d1286a0fb60c8e2962177e42c14e92e5d0-merged.mount: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215361]: 2025-11-29 06:50:25.094717549 +0000 UTC m=+0.110629584 container cleanup 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:50:25 np0005539505 podman[215369]: 2025-11-29 06:50:25.098161246 +0000 UTC m=+0.095925561 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:50:25 np0005539505 systemd[1]: libpod-conmon-0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e.scope: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215441]: 2025-11-29 06:50:25.154153832 +0000 UTC m=+0.038852825 container remove 0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.159 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f2499e33-a8c8-4fb4-ad60-3c6b98b25e84]: (4, ('Sat Nov 29 06:50:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 (0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e)\n0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e\nSat Nov 29 06:50:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 (0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e)\n0d13e1192e4d1329fabf5b11a4e66117e0f8e2219df4f387e61adadee9bd659e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.160 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd07dbb-385c-404b-80ba-4f5f1acd94f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.161 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda1138-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.163 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:25 np0005539505 kernel: tap5bda1138-f0: left promiscuous mode
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.178 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.180 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0e400df6-84be-4d05-afbc-12bdc9f12155]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.195 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[202fc6cc-19db-4c96-8857-3bf65612d63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.197 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4941b204-679e-437b-9816-3b47177976f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.216 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8d3072-b867-49fb-8ba7-4bff85dcfc44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446015, 'reachable_time': 25167, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215459, 'error': None, 'target': 'ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.220 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bda1138-fab5-4b3a-9a12-4d1c90a4dce0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.220 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d7c16b-acc2-497e-905f-d400b78c3b58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.221 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 04956313-39e4-4275-ab3f-18aa7a1a0e46 in datapath 24ee44f0-2b10-459c-aabf-bf9ef2c8d950 unbound from our chassis
Nov 29 01:50:25 np0005539505 systemd[1]: run-netns-ovnmeta\x2d5bda1138\x2dfab5\x2d4b3a\x2d9a12\x2d4d1c90a4dce0.mount: Deactivated successfully.
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.223 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24ee44f0-2b10-459c-aabf-bf9ef2c8d950, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.223 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4bbfabb4-a093-4407-b598-b7433b80fc17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.224 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 namespace which is not needed anymore
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [NOTICE]   (215187) : haproxy version is 2.8.14-c23fe91
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [NOTICE]   (215187) : path to executable is /usr/sbin/haproxy
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [ALERT]    (215187) : Current worker (215189) exited with code 143 (Terminated)
Nov 29 01:50:25 np0005539505 neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950[215183]: [WARNING]  (215187) : All workers exited. Exiting... (0)
Nov 29 01:50:25 np0005539505 systemd[1]: libpod-c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5.scope: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215478]: 2025-11-29 06:50:25.347358059 +0000 UTC m=+0.044899574 container died c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 01:50:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5-userdata-shm.mount: Deactivated successfully.
Nov 29 01:50:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay-f31a92e9378c42df3acf1395f99aabda98ac209abb752aefcbab2bfec28d32a7-merged.mount: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215478]: 2025-11-29 06:50:25.379320529 +0000 UTC m=+0.076862034 container cleanup c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:25 np0005539505 systemd[1]: libpod-conmon-c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5.scope: Deactivated successfully.
Nov 29 01:50:25 np0005539505 podman[215506]: 2025-11-29 06:50:25.439483092 +0000 UTC m=+0.039625766 container remove c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.444 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27d36819-dc11-40a6-85dd-152a648c9a22]: (4, ('Sat Nov 29 06:50:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5)\nc8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5\nSat Nov 29 06:50:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 (c8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5)\nc8c647ed39ef810894f52b656c969bfb92ced2b0d9f8b987bb0fc85bd0d384c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.446 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3103d559-e22d-4d07-8929-ed76bd1be2cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.447 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24ee44f0-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:25 np0005539505 kernel: tap24ee44f0-20: left promiscuous mode
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.497 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b0250d9f-1482-44bf-9beb-a1ad76d01117]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.510 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6fea2627-8a9f-4788-a7d1-0f3683548389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.511 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d67fe5b-5117-44f6-bec4-a6cd49f2ff95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.527 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a51caab9-4c44-4de5-92d9-4dc1ddab03a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446351, 'reachable_time': 43764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215526, 'error': None, 'target': 'ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.528 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24ee44f0-2b10-459c-aabf-bf9ef2c8d950 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:25.529 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[22bcc1ee-f17f-4ed2-a40b-273b83d4953d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.778 186962 DEBUG nova.virt.libvirt.vif [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:49:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-797555638',display_name='tempest-LiveMigrationTest-server-797555638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-livemigrationtest-server-797555638',id=10,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:49:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2b6eb92d93c24eaaa0c6a3104a54633a',ramdisk_id='',reservation_id='r-duitc0f8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-440211682',owner_user_name='tempest-LiveMigrationTest-440211682-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:50:20Z,user_data=None,user_id='a01fd01629a1493bb3fb6df5a2462226',uuid=e6b5b54b-9532-4f51-a346-42dee946a9ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.779 186962 DEBUG nova.network.os_vif_util [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converting VIF {"id": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "address": "fa:16:3e:b8:f5:6d", "network": {"id": "24ee44f0-2b10-459c-aabf-bf9ef2c8d950", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1003556338-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2b6eb92d93c24eaaa0c6a3104a54633a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04956313-39", "ovs_interfaceid": "04956313-39e4-4275-ab3f-18aa7a1a0e46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.780 186962 DEBUG nova.network.os_vif_util [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.780 186962 DEBUG os_vif [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.782 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.782 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04956313-39, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.784 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.785 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.788 186962 INFO os_vif [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:f5:6d,bridge_name='br-int',has_traffic_filtering=True,id=04956313-39e4-4275-ab3f-18aa7a1a0e46,network=Network(24ee44f0-2b10-459c-aabf-bf9ef2c8d950),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap04956313-39')#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.789 186962 INFO nova.virt.libvirt.driver [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Deleting instance files /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef_del#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.790 186962 INFO nova.virt.libvirt.driver [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Deletion of /var/lib/nova/instances/e6b5b54b-9532-4f51-a346-42dee946a9ef_del complete#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.883 186962 INFO nova.compute.manager [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Took 1.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.884 186962 DEBUG oslo.service.loopingcall [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.884 186962 DEBUG nova.compute.manager [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:25 np0005539505 nova_compute[186958]: 2025-11-29 06:50:25.884 186962 DEBUG nova.network.neutron [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:26 np0005539505 systemd[1]: run-netns-ovnmeta\x2d24ee44f0\x2d2b10\x2d459c\x2daabf\x2dbf9ef2c8d950.mount: Deactivated successfully.
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.449 186962 DEBUG nova.compute.manager [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.449 186962 DEBUG oslo_concurrency.lockutils [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.450 186962 DEBUG oslo_concurrency.lockutils [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.450 186962 DEBUG oslo_concurrency.lockutils [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.450 186962 DEBUG nova.compute.manager [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.451 186962 DEBUG nova.compute.manager [req-c4c5a3ac-6a90-4671-ab72-6e1b8cf2fdaa req-a64a9679-bbfd-44fe-84f8-e73f5ae7ef4e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-unplugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:50:26 np0005539505 nova_compute[186958]: 2025-11-29 06:50:26.772 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:26.926 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:26.927 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:26.928 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.366 186962 DEBUG nova.network.neutron [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updated VIF entry in instance network info cache for port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.367 186962 DEBUG nova.network.neutron [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updating instance_info_cache with network_info: [{"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.617 186962 DEBUG nova.compute.manager [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.617 186962 DEBUG oslo_concurrency.lockutils [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.618 186962 DEBUG oslo_concurrency.lockutils [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.618 186962 DEBUG oslo_concurrency.lockutils [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.618 186962 DEBUG nova.compute.manager [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] No waiting events found dispatching network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.619 186962 WARNING nova.compute.manager [req-dada7a16-dd56-4713-91f9-03451988aae1 req-cd3dc235-b669-4a7e-ba92-1337e9c9fc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Received unexpected event network-vif-plugged-04956313-39e4-4275-ab3f-18aa7a1a0e46 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.627 186962 DEBUG oslo_concurrency.lockutils [req-e6ed38f7-e418-43ea-bfff-d92916be12fb req-c59d5347-9119-45ba-9ce3-a231e83383ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:28 np0005539505 nova_compute[186958]: 2025-11-29 06:50:28.994 186962 DEBUG nova.network.neutron [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.029 186962 INFO nova.compute.manager [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Took 3.14 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.246 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.246 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.251 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.381 186962 INFO nova.scheduler.client.report [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Deleted allocations for instance e6b5b54b-9532-4f51-a346-42dee946a9ef#033[00m
Nov 29 01:50:29 np0005539505 nova_compute[186958]: 2025-11-29 06:50:29.507 186962 DEBUG oslo_concurrency.lockutils [None req-c589bf10-d88b-462d-8e9f-ad46583c15ac a01fd01629a1493bb3fb6df5a2462226 2b6eb92d93c24eaaa0c6a3104a54633a - - default default] Lock "e6b5b54b-9532-4f51-a346-42dee946a9ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:29 np0005539505 podman[215527]: 2025-11-29 06:50:29.732326973 +0000 UTC m=+0.068728215 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:30 np0005539505 nova_compute[186958]: 2025-11-29 06:50:30.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:31 np0005539505 nova_compute[186958]: 2025-11-29 06:50:31.775 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:33Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:29:d8:40 10.100.0.12
Nov 29 01:50:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:33Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:29:d8:40 10.100.0.12
Nov 29 01:50:34 np0005539505 podman[215561]: 2025-11-29 06:50:34.753006306 +0000 UTC m=+0.077380109 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:50:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:34.906 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:34 np0005539505 nova_compute[186958]: 2025-11-29 06:50:34.907 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:34.908 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:50:35 np0005539505 nova_compute[186958]: 2025-11-29 06:50:35.793 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:36 np0005539505 nova_compute[186958]: 2025-11-29 06:50:36.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:37 np0005539505 nova_compute[186958]: 2025-11-29 06:50:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:38.910 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.841 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.841 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.841 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.841 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.842 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.854 186962 INFO nova.compute.manager [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Terminating instance#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.867 186962 DEBUG nova.compute.manager [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:50:39 np0005539505 kernel: tap82af0ec7-8b (unregistering): left promiscuous mode
Nov 29 01:50:39 np0005539505 NetworkManager[55134]: <info>  [1764399039.8914] device (tap82af0ec7-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:39 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:39Z|00075|binding|INFO|Releasing lport 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d from this chassis (sb_readonly=0)
Nov 29 01:50:39 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:39Z|00076|binding|INFO|Setting lport 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d down in Southbound
Nov 29 01:50:39 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:39Z|00077|binding|INFO|Removing iface tap82af0ec7-8b ovn-installed in OVS
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:39.900 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:d8:40 10.100.0.12'], port_security=['fa:16:3e:29:d8:40 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9dcd4651-f485-4d8f-b6b2-02492d2c0a1c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61990940-649c-4332-bea9-4159087142dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a14e098-439d-46d7-8c4c-c5f31ab6085d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e9d8ee4-839b-4078-a12e-f22cafda935b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:39.901 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 82af0ec7-8b7a-4fe2-b069-4a1c3566e90d in datapath 61990940-649c-4332-bea9-4159087142dd unbound from our chassis#033[00m
Nov 29 01:50:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:39.902 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61990940-649c-4332-bea9-4159087142dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:50:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:39.903 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c6000d5a-f4fa-47a5-bf57-7f9c245e3934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:39.904 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61990940-649c-4332-bea9-4159087142dd namespace which is not needed anymore#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:39 np0005539505 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 29 01:50:39 np0005539505 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000c.scope: Consumed 13.022s CPU time.
Nov 29 01:50:39 np0005539505 systemd-machined[153285]: Machine qemu-6-instance-0000000c terminated.
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.982 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399024.9815123, e6b5b54b-9532-4f51-a346-42dee946a9ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:39 np0005539505 nova_compute[186958]: 2025-11-29 06:50:39.982 186962 INFO nova.compute.manager [-] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.009 186962 DEBUG nova.compute.manager [None req-870829a8-131c-45b6-b187-5348d88e75e0 - - - - - -] [instance: e6b5b54b-9532-4f51-a346-42dee946a9ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:40 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [NOTICE]   (215312) : haproxy version is 2.8.14-c23fe91
Nov 29 01:50:40 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [NOTICE]   (215312) : path to executable is /usr/sbin/haproxy
Nov 29 01:50:40 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [WARNING]  (215312) : Exiting Master process...
Nov 29 01:50:40 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [ALERT]    (215312) : Current worker (215314) exited with code 143 (Terminated)
Nov 29 01:50:40 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215290]: [WARNING]  (215312) : All workers exited. Exiting... (0)
Nov 29 01:50:40 np0005539505 systemd[1]: libpod-f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b.scope: Deactivated successfully.
Nov 29 01:50:40 np0005539505 podman[215607]: 2025-11-29 06:50:40.052917959 +0000 UTC m=+0.046512600 container died f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.119 186962 INFO nova.virt.libvirt.driver [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Instance destroyed successfully.#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.119 186962 DEBUG nova.objects.instance [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'resources' on Instance uuid 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.151 186962 DEBUG nova.virt.libvirt.vif [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:50:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-250603763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(21),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-250603763',id=12,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=21,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:50:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-35t5cudy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:50:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=9dcd4651-f485-4d8f-b6b2-02492d2c0a1c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.151 186962 DEBUG nova.network.os_vif_util [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "address": "fa:16:3e:29:d8:40", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82af0ec7-8b", "ovs_interfaceid": "82af0ec7-8b7a-4fe2-b069-4a1c3566e90d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.152 186962 DEBUG nova.network.os_vif_util [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.152 186962 DEBUG os_vif [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.154 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82af0ec7-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.157 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.159 186962 INFO os_vif [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:29:d8:40,bridge_name='br-int',has_traffic_filtering=True,id=82af0ec7-8b7a-4fe2-b069-4a1c3566e90d,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82af0ec7-8b')#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.160 186962 INFO nova.virt.libvirt.driver [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Deleting instance files /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c_del#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.160 186962 INFO nova.virt.libvirt.driver [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Deletion of /var/lib/nova/instances/9dcd4651-f485-4d8f-b6b2-02492d2c0a1c_del complete#033[00m
Nov 29 01:50:40 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:40Z|00078|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b-userdata-shm.mount: Deactivated successfully.
Nov 29 01:50:40 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c564f740949d9b90280ce489920b41e3194481b41de4d1ce42c60b647b29697f-merged.mount: Deactivated successfully.
Nov 29 01:50:40 np0005539505 podman[215607]: 2025-11-29 06:50:40.219308242 +0000 UTC m=+0.212902883 container cleanup f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:50:40 np0005539505 systemd[1]: libpod-conmon-f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b.scope: Deactivated successfully.
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.240 186962 DEBUG nova.compute.manager [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-unplugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.241 186962 DEBUG oslo_concurrency.lockutils [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.241 186962 DEBUG oslo_concurrency.lockutils [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.241 186962 DEBUG oslo_concurrency.lockutils [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.241 186962 DEBUG nova.compute.manager [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] No waiting events found dispatching network-vif-unplugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.241 186962 DEBUG nova.compute.manager [req-97119f4e-e66a-411f-bc43-d96bfe497762 req-831d0c6e-6929-4c05-9267-1079b8ec5806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-unplugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.267 186962 INFO nova.compute.manager [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.267 186962 DEBUG oslo.service.loopingcall [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.268 186962 DEBUG nova.compute.manager [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.268 186962 DEBUG nova.network.neutron [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:50:40 np0005539505 podman[215655]: 2025-11-29 06:50:40.274559066 +0000 UTC m=+0.037222708 container remove f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.279 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3dca8105-5383-48a6-850f-f09343a2c013]: (4, ('Sat Nov 29 06:50:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd (f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b)\nf6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b\nSat Nov 29 06:50:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd (f6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b)\nf6d89c9ea5b3c39a458223487aec1f00e3f07aa929ec883dcc85cbf660c9eb4b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.280 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e698f939-7404-4b8f-b59d-2b1473167a2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.281 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61990940-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 kernel: tap61990940-60: left promiscuous mode
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.363 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[346f5c58-7c56-4896-a994-1fd9a5c1e668]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.410 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.410 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.411 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.434 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.434 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.434 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.434 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.436 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e9adba71-85c2-4007-a877-be3e5073b845]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.437 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a2c9d4-4be1-4ad1-acb6-00404ae7615f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.454 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b7540c1e-2b73-48f8-abc1-d8b7ba81a2e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446587, 'reachable_time': 20970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215670, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.456 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61990940-649c-4332-bea9-4159087142dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:50:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:40.456 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[37793d85-939c-4e66-bfec-7fbab5bdd0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:40 np0005539505 systemd[1]: run-netns-ovnmeta\x2d61990940\x2d649c\x2d4332\x2dbea9\x2d4159087142dd.mount: Deactivated successfully.
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.578 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.579 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.34396362304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.579 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.579 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.646 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.647 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.647 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.692 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.709 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.764 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:50:40 np0005539505 nova_compute[186958]: 2025-11-29 06:50:40.764 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.614 186962 DEBUG nova.network.neutron [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.637 186962 INFO nova.compute.manager [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Took 1.37 seconds to deallocate network for instance.#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.731 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.732 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.732 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.732 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.745 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.746 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.799 186962 DEBUG nova.compute.provider_tree [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.821 186962 DEBUG nova.scheduler.client.report [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.847 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.870 186962 INFO nova.scheduler.client.report [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Deleted allocations for instance 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c#033[00m
Nov 29 01:50:41 np0005539505 nova_compute[186958]: 2025-11-29 06:50:41.955 186962 DEBUG oslo_concurrency.lockutils [None req-1e823110-031f-47da-bd94-812fb2b41bd7 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.320 186962 DEBUG nova.compute.manager [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 DEBUG oslo_concurrency.lockutils [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 DEBUG oslo_concurrency.lockutils [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 DEBUG oslo_concurrency.lockutils [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9dcd4651-f485-4d8f-b6b2-02492d2c0a1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 DEBUG nova.compute.manager [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] No waiting events found dispatching network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 WARNING nova.compute.manager [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received unexpected event network-vif-plugged-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.321 186962 DEBUG nova.compute.manager [req-2a863556-d350-4fdf-bf54-b047fb81be6b req-80a9f3f6-caae-4149-8c9a-89b967d3765e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Received event network-vif-deleted-82af0ec7-8b7a-4fe2-b069-4a1c3566e90d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:42 np0005539505 nova_compute[186958]: 2025-11-29 06:50:42.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.529 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.529 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.552 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.677 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.678 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.684 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.685 186962 INFO nova.compute.claims [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.815 186962 DEBUG nova.compute.provider_tree [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.836 186962 DEBUG nova.scheduler.client.report [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.857 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.858 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.907 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.908 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:50:43 np0005539505 nova_compute[186958]: 2025-11-29 06:50:43.929 186962 INFO nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.017 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.431 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.434 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.434 186962 INFO nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Creating image(s)#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.435 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.436 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.437 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.463 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.490 186962 DEBUG nova.policy [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.520 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.521 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.522 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.531 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.620 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.621 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.880 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk 1073741824" returned: 0 in 0.259s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.881 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.881 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.969 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.971 186962 DEBUG nova.virt.disk.api [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Checking if we can resize image /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:50:44 np0005539505 nova_compute[186958]: 2025-11-29 06:50:44.972 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.045 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.047 186962 DEBUG nova.virt.disk.api [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Cannot resize image /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.048 186962 DEBUG nova.objects.instance [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'migration_context' on Instance uuid 1d7d2595-f3da-459e-a875-4b7ef22931bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.071 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.072 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.073 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.074 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.075 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.075 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.101 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/_base/ephemeral_1_0706d66 1G" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.104 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.138 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "mkfs -t vfat -n ephemeral0 /var/lib/nova/instances/_base/ephemeral_1_0706d66" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.139 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.150 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.223 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.224 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "ephemeral_1_0706d66" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.225 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "ephemeral_1_0706d66" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.244 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.307 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.308 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.eph0 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.338 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ephemeral_1_0706d66,backing_fmt=raw /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.eph0 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.339 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "ephemeral_1_0706d66" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.339 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.395 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ephemeral_1_0706d66 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.396 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.396 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Ensure instance console log exists: /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.396 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.397 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.397 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:45 np0005539505 nova_compute[186958]: 2025-11-29 06:50:45.574 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Successfully created port: 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.509 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Successfully updated port: 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.554 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.555 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquired lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.555 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.723 186962 DEBUG nova.compute.manager [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-changed-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.723 186962 DEBUG nova.compute.manager [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Refreshing instance network info cache due to event network-changed-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.723 186962 DEBUG oslo_concurrency.lockutils [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:46 np0005539505 podman[215703]: 2025-11-29 06:50:46.737068407 +0000 UTC m=+0.069614500 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 01:50:46 np0005539505 podman[215704]: 2025-11-29 06:50:46.762184984 +0000 UTC m=+0.089896181 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:46 np0005539505 nova_compute[186958]: 2025-11-29 06:50:46.825 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.618 186962 DEBUG nova.network.neutron [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updating instance_info_cache with network_info: [{"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.637 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Releasing lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.638 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Instance network_info: |[{"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.639 186962 DEBUG oslo_concurrency.lockutils [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.640 186962 DEBUG nova.network.neutron [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Refreshing network info cache for port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.647 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Start _get_guest_xml network_info=[{"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [{'encryption_options': None, 'encryption_format': None, 'size': 1, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vdb', 'encrypted': False, 'guest_format': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.653 186962 WARNING nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.658 186962 DEBUG nova.virt.libvirt.host [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.659 186962 DEBUG nova.virt.libvirt.host [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.668 186962 DEBUG nova.virt.libvirt.host [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.668 186962 DEBUG nova.virt.libvirt.host [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.670 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.670 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:50:02Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={hw_rng:allowed='True'},flavorid='1685538692',id=20,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_1-983200373',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.671 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.671 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.672 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.672 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.672 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.673 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.673 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.673 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.674 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.674 186962 DEBUG nova.virt.hardware [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.678 186962 DEBUG nova.virt.libvirt.vif [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:50:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(20),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1781641709',id=15,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=20,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-lhsz7lpk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:50:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=1d7d2595-f3da-459e-a875-4b7ef22931bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.679 186962 DEBUG nova.network.os_vif_util [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.680 186962 DEBUG nova.network.os_vif_util [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.681 186962 DEBUG nova.objects.instance [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'pci_devices' on Instance uuid 1d7d2595-f3da-459e-a875-4b7ef22931bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.702 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <uuid>1d7d2595-f3da-459e-a875-4b7ef22931bc</uuid>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <name>instance-0000000f</name>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersWithSpecificFlavorTestJSON-server-1781641709</nova:name>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:50:47</nova:creationTime>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:flavor name="tempest-flavor_with_ephemeral_1-983200373">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:ephemeral>1</nova:ephemeral>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:user uuid="0908eb33a338434891ed9f5dd3768bab">tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member</nova:user>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:project uuid="99863a77c63a4673b0ef23a7c0ae373a">tempest-ServersWithSpecificFlavorTestJSON-706098489</nova:project>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        <nova:port uuid="63ca2ebd-3ecf-41f1-aaa0-5aca12224c62">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="serial">1d7d2595-f3da-459e-a875-4b7ef22931bc</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="uuid">1d7d2595-f3da-459e-a875-4b7ef22931bc</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.eph0"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <target dev="vdb" bus="virtio"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.config"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:99:7a:dd"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <target dev="tap63ca2ebd-3e"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/console.log" append="off"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:50:47 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:50:47 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:50:47 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:50:47 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.703 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Preparing to wait for external event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.703 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.704 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.704 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.705 186962 DEBUG nova.virt.libvirt.vif [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:50:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',ec2_ids=EC2Ids,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(20),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1781641709',id=15,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=20,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-lhsz7lpk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:50:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=1d7d2595-f3da-459e-a875-4b7ef22931bc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.705 186962 DEBUG nova.network.os_vif_util [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.706 186962 DEBUG nova.network.os_vif_util [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.706 186962 DEBUG os_vif [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.708 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.708 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.711 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.711 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63ca2ebd-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.712 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63ca2ebd-3e, col_values=(('external_ids', {'iface-id': '63ca2ebd-3ecf-41f1-aaa0-5aca12224c62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:7a:dd', 'vm-uuid': '1d7d2595-f3da-459e-a875-4b7ef22931bc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:50:47 np0005539505 NetworkManager[55134]: <info>  [1764399047.7151] manager: (tap63ca2ebd-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.714 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.716 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.719 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.720 186962 INFO os_vif [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e')
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.898 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.899 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.899 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.899 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] No VIF found with MAC fa:16:3e:99:7a:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 01:50:47 np0005539505 nova_compute[186958]: 2025-11-29 06:50:47.900 186962 INFO nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Using config drive
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.357 186962 INFO nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Creating config drive at /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.config
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.364 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3n07dzh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.454 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8eb302a794efaba1f3c4d181e523060affea2f808455707e52bbb7cd2e0b3f10" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.487 186962 DEBUG oslo_concurrency.processutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3n07dzh" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:50:48 np0005539505 kernel: tap63ca2ebd-3e: entered promiscuous mode
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.549 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1187 Content-Type: application/json Date: Sat, 29 Nov 2025 06:50:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f8cee936-7bdd-45ef-a095-9bca9ed03b14 x-openstack-request-id: req-f8cee936-7bdd-45ef-a095-9bca9ed03b14 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.549 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.549 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1523851324", "name": "tempest-flavor_with_ephemeral_0-1649149277", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1523851324"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1523851324"}]}, {"id": "1685538692", "name": "tempest-flavor_with_ephemeral_1-983200373", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1685538692"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1685538692"}]}, {"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.549 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-f8cee936-7bdd-45ef-a095-9bca9ed03b14 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.550 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1685538692 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8eb302a794efaba1f3c4d181e523060affea2f808455707e52bbb7cd2e0b3f10" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:50:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:48Z|00079|binding|INFO|Claiming lport 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 for this chassis.
Nov 29 01:50:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:48Z|00080|binding|INFO|63ca2ebd-3ecf-41f1-aaa0-5aca12224c62: Claiming fa:16:3e:99:7a:dd 10.100.0.12
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.5513] manager: (tap63ca2ebd-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.555 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.564 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:7a:dd 10.100.0.12'], port_security=['fa:16:3e:99:7a:dd 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61990940-649c-4332-bea9-4159087142dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a14e098-439d-46d7-8c4c-c5f31ab6085d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e9d8ee4-839b-4078-a12e-f22cafda935b, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.566 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 in datapath 61990940-649c-4332-bea9-4159087142dd bound to our chassis
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.567 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61990940-649c-4332-bea9-4159087142dd
Nov 29 01:50:48 np0005539505 systemd-udevd[215767]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[02195d94-15a3-4b06-b8ac-2b5514b7f0b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.580 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61990940-61 in ovnmeta-61990940-649c-4332-bea9-4159087142dd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.581 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61990940-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.581 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fa5cec-bea1-4db5-be7c-a0ca046d3c81]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.582 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03632cfb-856a-43f6-832c-4bdc35ff8286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 systemd-machined[153285]: New machine qemu-7-instance-0000000f.
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.591 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[472bb86a-c7e6-46e3-90a2-9e3fe650f75f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.5933] device (tap63ca2ebd-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.5944] device (tap63ca2ebd-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:48Z|00081|binding|INFO|Setting lport 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 ovn-installed in OVS
Nov 29 01:50:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:48Z|00082|binding|INFO|Setting lport 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 up in Southbound
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:50:48 np0005539505 systemd[1]: Started Virtual Machine qemu-7-instance-0000000f.
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.616 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb54c29-034b-4027-bd4d-663bbd7ececa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.643 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[953acbb3-cb61-44b8-842d-4122a469b8c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[edd3f520-16fc-459e-a589-7d512783f6bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.6494] manager: (tap61990940-60): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.653 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 451 Content-Type: application/json Date: Sat, 29 Nov 2025 06:50:48 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a03a0728-87e5-4583-a13a-031874555993 x-openstack-request-id: req-a03a0728-87e5-4583-a13a-031874555993 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.653 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1685538692", "name": "tempest-flavor_with_ephemeral_1-983200373", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1685538692"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1685538692"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.653 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1685538692 used request id req-a03a0728-87e5-4583-a13a-031874555993 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.655 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '99863a77c63a4673b0ef23a7c0ae373a', 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'hostId': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:48.655 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.676 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a014819b-7fc8-4516-8a17-c2ca2c4060c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.680 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[88cb46ea-14c3-4b91-952d-a8a3ea6d9ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.7043] device (tap61990940-60): carrier: link connected
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.710 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[544874ac-0eeb-40f4-aa18-68939138474c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.729 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3ede3ea7-e820-432d-a7f5-4948abd3a090]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61990940-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:ea:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449628, 'reachable_time': 33977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215800, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.745 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[694dc646-b185-421c-bfd9-ea7ddc597362]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:ea5d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449628, 'tstamp': 449628}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215801, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.763 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[73e70ab2-cd9a-42ed-8161-098f10e81f33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61990940-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:ea:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449628, 'reachable_time': 33977, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215802, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.796 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[581a29b4-726e-4f4f-9178-f66c3bbc2a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.830 186962 DEBUG nova.compute.manager [req-fa7ff748-dbe6-45ef-a188-f3f661aab0b9 req-01b9898b-0c79-44d2-bd93-9d96e831d714 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.831 186962 DEBUG oslo_concurrency.lockutils [req-fa7ff748-dbe6-45ef-a188-f3f661aab0b9 req-01b9898b-0c79-44d2-bd93-9d96e831d714 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.832 186962 DEBUG oslo_concurrency.lockutils [req-fa7ff748-dbe6-45ef-a188-f3f661aab0b9 req-01b9898b-0c79-44d2-bd93-9d96e831d714 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.832 186962 DEBUG oslo_concurrency.lockutils [req-fa7ff748-dbe6-45ef-a188-f3f661aab0b9 req-01b9898b-0c79-44d2-bd93-9d96e831d714 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.833 186962 DEBUG nova.compute.manager [req-fa7ff748-dbe6-45ef-a188-f3f661aab0b9 req-01b9898b-0c79-44d2-bd93-9d96e831d714 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Processing event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.862 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[359cfb43-9184-4f51-9ba4-b9e93ce69c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.864 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61990940-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.864 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.865 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61990940-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:48 np0005539505 NetworkManager[55134]: <info>  [1764399048.8681] manager: (tap61990940-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 29 01:50:48 np0005539505 kernel: tap61990940-60: entered promiscuous mode
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.872 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61990940-60, col_values=(('external_ids', {'iface-id': '04e49043-e1b1-4b06-a437-d8d097a15b16'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:50:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:48Z|00083|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.874 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:48 np0005539505 nova_compute[186958]: 2025-11-29 06:50:48.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.887 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.888 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8252f6-f140-4154-9a78-760f3dbf29e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.889 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-61990940-649c-4332-bea9-4159087142dd
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/61990940-649c-4332-bea9-4159087142dd.pid.haproxy
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 61990940-649c-4332-bea9-4159087142dd
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:50:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:50:48.890 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'env', 'PROCESS_TAG=haproxy-61990940-649c-4332-bea9-4159087142dd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61990940-649c-4332-bea9-4159087142dd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.113 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399049.1129534, 1d7d2595-f3da-459e-a875-4b7ef22931bc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.114 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] VM Started (Lifecycle Event)#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.116 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.120 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.123 186962 INFO nova.virt.libvirt.driver [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Instance spawned successfully.#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.123 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.134 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.137 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.154 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.156 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.157 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.158 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.158 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.159 186962 DEBUG nova.virt.libvirt.driver [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.166 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.167 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399049.1158054, 1d7d2595-f3da-459e-a875-4b7ef22931bc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.168 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.170 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.171 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.172 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '294c1211-cfe1-4b4e-8e51-c0357434048e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:48.655976', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd2f1478-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': 'b8aaf36bfb81fb9022b34f8f761e05f585351a06a71dbd4ac7a6de14cb7749ca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:48.655976', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd2f2f1c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': 'aa2868533d284f789a70fd6c072679e3db81a90f8a5f493f12defa8fcc71f65e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:48.655976', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd2f44ac-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '67b13a44552efb8d2aa779bc9c286b022511a12ad9f9a3e724815b5d931299a5'}]}, 'timestamp': '2025-11-29 06:50:49.172958', '_unique_id': '0bfcfd1bf8414ceaa5ae338d15c152c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.187 186962 DEBUG nova.network.neutron [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updated VIF entry in instance network info cache for port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.188 186962 DEBUG nova.network.neutron [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updating instance_info_cache with network_info: [{"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.191 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1d7d2595-f3da-459e-a875-4b7ef22931bc / tap63ca2ebd-3e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.192 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cca6bbad-5361-427d-bd77-5d9bd3b483e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.188131', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd324f30-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '8b66d9022ccbad16972fa307e2fdb97ac1c69a9b8dcfb3fb8a50f5f6eea4b532'}]}, 'timestamp': '2025-11-29 06:50:49.192898', '_unique_id': '08d4388646a048b8be1c076a50c793ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.195 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.197 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3564c1b-93f3-4920-95d8-946b9eeb851b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.197594', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3319c4-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '05ed4f499474a5b8d43193dbec67a3b108bd8d24435640f0c9b6ed2a929a0b24'}]}, 'timestamp': '2025-11-29 06:50:49.198006', '_unique_id': 'f1ee5cc3cd0b40ffb40f6fd4a1ca7e2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.199 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6327ef7-1e20-4df9-a1fd-a464dde66ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.199693', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3367d0-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': 'e1f664bd7b8503bbaa4d2eb43e3f899336f24ea5e9a732e1ecc08d02500cc4cf'}]}, 'timestamp': '2025-11-29 06:50:49.199931', '_unique_id': '10c13dec8f294932986c80a3fb4c7327'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.200 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.202 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399049.120207, 1d7d2595-f3da-459e-a875-4b7ef22931bc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.202 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.216 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.216 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 1d7d2595-f3da-459e-a875-4b7ef22931bc: ceilometer.compute.pollsters.NoVolumeException
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.216 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.216 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.216 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>]
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.217 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.217 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.217 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f6db5f6-e2af-475b-b0bb-9d7af2321988', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.217507', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd362060-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '14a142d78f2c4f92235f9de028e9884f12ae36d69b4f9ed267f623e248510c98'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.217507', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd362998-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '3e12d4414a0bba8341271bf3aec97bbfac179e6a7e57da2e750b6997d4189af1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.217507', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd36321c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': 'c0e6399df71ba3a72bac22b0bf343e368b31585b193fb7d0431810b90f3ed8d5'}]}, 'timestamp': '2025-11-29 06:50:49.218206', '_unique_id': 'b4d6c1050f274cc08cafc6d820d81b63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.219 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac4a69a5-2c6f-4a8c-bd3c-eff4d33171cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.219676', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3674ac-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': 'b1780e6b75959c4f946266e31c30b8874e07c0a7c1924246924cf656a94fa445'}]}, 'timestamp': '2025-11-29 06:50:49.219922', '_unique_id': 'f9f0a9093d8749089a8813ea73c94c2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '842724fd-f445-4b24-8422-fdec81d66910', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.221022', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd36a8be-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '884fa690a7530fa2830776f04c9c351e68f2eea74aadd2938c99f4183ff0caf5'}]}, 'timestamp': '2025-11-29 06:50:49.221270', '_unique_id': '05bc5185b2e044738f2fe9411d55d64f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.221 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.222 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.222 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.222 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '760bc9ca-b279-498c-bf22-2ec8287300ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.222350', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd36dc80-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '5537fb02f6e63187fc36b0b4e3a8f922ae8bccbc863dfaf5ea5794b78076b031'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.222350', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd36e48c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '4f70be517d096e9904dcda821f50d36f88f9812631731b5a4fb72d7061efaacd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.222350', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd36ec2a-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '507fc4e39df8aedd3f4d7a97556db997eedb6a7fd1cb06ae339d5ebdd9e2bff2'}]}, 'timestamp': '2025-11-29 06:50:49.222971', '_unique_id': '7fac6336958e4fceb4c1db5fe8135fd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.223 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad06c358-d41c-44b0-b674-f508423640fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.224080', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3720aa-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '1c49260d94585ef9f59dab9b218dc4cdbe6d1a48c4cb238d4c3a3862c65a3a0b'}]}, 'timestamp': '2025-11-29 06:50:49.224382', '_unique_id': 'd961e8c73c8843bca6650ec2c01e4d36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.225 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.225 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.226 186962 DEBUG oslo_concurrency.lockutils [req-3d171d15-b9ba-471b-80b6-4184da75ea7c req-0a38589f-726d-4db3-add5-6a143e13f456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cd44e3a-6697-4e44-849d-d1987ad5b884', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.225585', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd375af2-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': 'fe893ea3d246e29e493adf3f8eddae639c62448c684881065c3c3b981b0406a0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.225585', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd376362-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '883cf71724a5c2864e0e1d2ade54ad5fc0ba9abc868c72511adc323027705859'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.225585', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd376c68-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '860024dfd2d06c87c99d11289d9715c34580cbdf536203d69ddee272cc1eb93a'}]}, 'timestamp': '2025-11-29 06:50:49.226260', '_unique_id': 'b7773e13a4bd4369b523d2fcb33dbef9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.226 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.227 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.230 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.242 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.242 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.242 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1bacb81-9dcb-4d6d-b736-c1a3e40bbe59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.227436', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd39e79a-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': 'f8b6a61349eb28260833f2412a2b338c0eec088b7d2254c6f27856cd8e9419a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.227436', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd39f19a-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '471c044fb7a483e68dc497eb534bb246abb85875456741b57358ceb80332fb43'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.227436', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd39f924-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '600006587c8addd17f8e908bcb66288836bd247a71a1991303074a74e100bb8e'}]}, 'timestamp': '2025-11-29 06:50:49.242972', '_unique_id': 'c9bff4ce58f34af88aae1f7090a52dda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.244 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.244 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>]
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.244 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.245 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.245 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bd25ea1-38e6-4d92-9f05-90c1619dba4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.244855', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd3a4c1c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '344f65aa84bbaf1d39016dd5d73113f69805b21eac553928c360c7b3840ef369'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.244855', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd3a5450-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '5cb445f66864c950578198cdebbc456dca1c5744e8660a0769ae9e350b927388'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.244855', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd3a5ca2-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '4668d5239ad8e3fa6639d212a05eec3ead0539e5154c906711fb1d63e8fd5427'}]}, 'timestamp': '2025-11-29 06:50:49.245498', '_unique_id': '7f99bb0ff0a94182a3c38095b9f004ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.246 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/cpu volume: 70000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '395f3c2f-455a-413c-89f3-8ccd6af2e699', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 70000000, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'timestamp': '2025-11-29T06:50:49.246856', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bd3a9b2c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.856835126, 'message_signature': 'b181b81a9ccce4d3ce51d4159f8009701b3c430b18d65190e95f26d18847fb6d'}]}, 'timestamp': '2025-11-29 06:50:49.247148', '_unique_id': '2479868807dc4c129259917a9d13cd10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.248 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.248 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.249 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a27f41d7-9096-4eb6-b49b-dd28b7f5d21a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.248642', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd3ae08c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': 'de0bb8a420095f60840ca302e85d15d57f9eabd4d9fa70adb53e007c7315e2c7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.248642', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd3ae956-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '47a33e7d54583e72845ac479b4a4e516a790c528dd35079ced77e864d7542bca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.248642', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd3af3ba-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.296901088, 'message_signature': '503765036f22b6bdba8ee447d8f46e082cd55c430da586b8f674a2a89e4aa013'}]}, 'timestamp': '2025-11-29 06:50:49.249412', '_unique_id': 'cebf6746fadb467fb7b2d258b4210e9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.251 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.251 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>]
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.251 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9740732-2bb3-495a-a197-e851b248bd20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.251434', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3b4ce8-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '94982731a594505765e907962a2e735e747fb06c215c8f3acee9c7c4ae360243'}]}, 'timestamp': '2025-11-29 06:50:49.251666', '_unique_id': '102e1427643441c09cedd4974985fe19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.252 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89e3779e-6ada-454a-adbb-e7c284defdc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.252732', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3b7f42-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '418688bbaff1c3af0aa7ba2df9a74ffac99960cf637711e8e992fb9570382331'}]}, 'timestamp': '2025-11-29 06:50:49.252952', '_unique_id': '7fd0e3b2cd464fefa9ce81d7473c9f8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '314d4920-6671-412f-b55b-0bdf31194a07', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.253982', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3bb00c-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '5490af0942376e331c42f6d5cac4e964b0dfcf484b0e806b0ddd3d9197af407c'}]}, 'timestamp': '2025-11-29 06:50:49.254241', '_unique_id': 'cf31a78ab1274a7babb41b65fe340252'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e54d29db-02b9-4b9e-a1c7-0b029097c9a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': 'instance-0000000f-1d7d2595-f3da-459e-a875-4b7ef22931bc-tap63ca2ebd-3e', 'timestamp': '2025-11-29T06:50:49.255273', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'tap63ca2ebd-3e', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:99:7a:dd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63ca2ebd-3e'}, 'message_id': 'bd3be2fc-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.829449295, 'message_signature': '984b30d67adb35c579c7ccb0077eb7e8159e68b70275ceb77def196dd319e9ed'}]}, 'timestamp': '2025-11-29 06:50:49.255506', '_unique_id': '3e668fe9d7074f46916ad11b545d6878'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.256 186962 INFO nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Took 4.82 seconds to spawn the instance on the hypervisor.
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.256 186962 DEBUG nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.255 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.256 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.256 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersWithSpecificFlavorTestJSON-server-1781641709>]
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.256 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a8480e1-3fd0-49b8-b962-55521b56d9e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.256824', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd3c1f1a-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '791078ae98a65457d01d19359af8ce6828bc66b1591ec35dc3df61f89eab7d13'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.256824', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd3c26cc-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': 'f243cf9e8da0ecc9e91d96511ce2bab69c5e8d68915ffbd83e6b30c5463abcc1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.256824', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd3c2ec4-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '5dde6e334216b749fdd8db0d34a7f1e5527115dbc626232007d5ce7a4608097e'}]}, 'timestamp': '2025-11-29 06:50:49.257431', '_unique_id': '8400ce90eee0479cae676b2c64c2c8a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.257 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.258 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.258 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.259 12 DEBUG ceilometer.compute.pollsters [-] 1d7d2595-f3da-459e-a875-4b7ef22931bc/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b9de9bf-4e66-480c-bcc7-3322aea86447', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vda', 'timestamp': '2025-11-29T06:50:49.258745', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bd3c6a2e-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '351a22d76fc23747b2cf2ee0d6d3870e01c400823cd7953b7704e9d1047322a5'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': 
'99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-vdb', 'timestamp': '2025-11-29T06:50:49.258745', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bd3c71e0-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '0ff6b84926944d8ee46c395927eea437c5fba2ebee7f3c02c02a0f66a39b9852'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '0908eb33a338434891ed9f5dd3768bab', 'user_name': None, 'project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'project_name': None, 'resource_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc-sda', 'timestamp': '2025-11-29T06:50:49.258745', 'resource_metadata': {'display_name': 'tempest-ServersWithSpecificFlavorTestJSON-server-1781641709', 'name': 'instance-0000000f', 'instance_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'instance_type': 'tempest-flavor_with_ephemeral_1-983200373', 'host': '85fcc04b68bf794e3f1c028a7aacbb623a40323ed414e033ef62d6bf', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1685538692', 'name': 
'tempest-flavor_with_ephemeral_1-983200373', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'sda'}, 'message_id': 'bd3c79ba-ccef-11f0-8954-fa163e5a5606', 'monotonic_time': 4496.868334269, 'message_signature': '7b60981265c6e13266dcf0d1139b257cac7a5f86971a51fcf19ec40f3c216a23'}]}, 'timestamp': '2025-11-29 06:50:49.259349', '_unique_id': 'ce22b01c02f540bc874e869583732422'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:50:49 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:50:49.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.262 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:50:49 np0005539505 podman[215841]: 2025-11-29 06:50:49.25084222 +0000 UTC m=+0.031524258 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.359 186962 INFO nova.compute.manager [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Took 5.71 seconds to build instance.#033[00m
Nov 29 01:50:49 np0005539505 nova_compute[186958]: 2025-11-29 06:50:49.380 186962 DEBUG oslo_concurrency.lockutils [None req-b7279afe-a3e7-45b6-898d-748df0d08175 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:49 np0005539505 podman[215841]: 2025-11-29 06:50:49.671181249 +0000 UTC m=+0.451863257 container create 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:49 np0005539505 systemd[1]: Started libpod-conmon-4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18.scope.
Nov 29 01:50:49 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:50:49 np0005539505 podman[215855]: 2025-11-29 06:50:49.819152573 +0000 UTC m=+0.144208139 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:50:49 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00219f4316926ace31cfbb9b6ceedd34b8c75c4088331caf6485e65237899871/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:50:49 np0005539505 podman[215841]: 2025-11-29 06:50:49.995466755 +0000 UTC m=+0.776148783 container init 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:50:50 np0005539505 podman[215841]: 2025-11-29 06:50:50.007027441 +0000 UTC m=+0.787709449 container start 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:50:50 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [NOTICE]   (215878) : New worker (215880) forked
Nov 29 01:50:50 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [NOTICE]   (215878) : Loading success.
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.966 186962 DEBUG nova.compute.manager [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.967 186962 DEBUG oslo_concurrency.lockutils [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.967 186962 DEBUG oslo_concurrency.lockutils [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.967 186962 DEBUG oslo_concurrency.lockutils [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.968 186962 DEBUG nova.compute.manager [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] No waiting events found dispatching network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:50:50 np0005539505 nova_compute[186958]: 2025-11-29 06:50:50.968 186962 WARNING nova.compute.manager [req-46a8eefd-176d-4c0c-a881-f74bc59e91dd req-fdb8b157-9f6a-4572-b04d-982f7dfe342d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received unexpected event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:50:51 np0005539505 nova_compute[186958]: 2025-11-29 06:50:51.848 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:52 np0005539505 nova_compute[186958]: 2025-11-29 06:50:52.715 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:55 np0005539505 nova_compute[186958]: 2025-11-29 06:50:55.118 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399040.1173878, 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:50:55 np0005539505 nova_compute[186958]: 2025-11-29 06:50:55.119 186962 INFO nova.compute.manager [-] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:50:55 np0005539505 podman[215889]: 2025-11-29 06:50:55.746368063 +0000 UTC m=+0.069411062 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:50:55 np0005539505 podman[215890]: 2025-11-29 06:50:55.785001839 +0000 UTC m=+0.106289879 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:50:55 np0005539505 nova_compute[186958]: 2025-11-29 06:50:55.939 186962 DEBUG nova.compute.manager [None req-5baccbce-032b-41da-94ff-ea21905104bd - - - - - -] [instance: 9dcd4651-f485-4d8f-b6b2-02492d2c0a1c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:50:56 np0005539505 nova_compute[186958]: 2025-11-29 06:50:56.075 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:56 np0005539505 NetworkManager[55134]: <info>  [1764399056.0766] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 29 01:50:56 np0005539505 NetworkManager[55134]: <info>  [1764399056.0774] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 29 01:50:56 np0005539505 nova_compute[186958]: 2025-11-29 06:50:56.142 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:50:56Z|00084|binding|INFO|Releasing lport 04e49043-e1b1-4b06-a437-d8d097a15b16 from this chassis (sb_readonly=0)
Nov 29 01:50:56 np0005539505 nova_compute[186958]: 2025-11-29 06:50:56.151 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:56 np0005539505 nova_compute[186958]: 2025-11-29 06:50:56.848 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.718 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.895 186962 DEBUG nova.compute.manager [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-changed-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.896 186962 DEBUG nova.compute.manager [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Refreshing instance network info cache due to event network-changed-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.897 186962 DEBUG oslo_concurrency.lockutils [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.897 186962 DEBUG oslo_concurrency.lockutils [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:50:57 np0005539505 nova_compute[186958]: 2025-11-29 06:50:57.897 186962 DEBUG nova.network.neutron [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Refreshing network info cache for port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:51:00 np0005539505 podman[215952]: 2025-11-29 06:51:00.725740039 +0000 UTC m=+0.055138271 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:51:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:51:01Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:7a:dd 10.100.0.12
Nov 29 01:51:01 np0005539505 ovn_controller[95143]: 2025-11-29T06:51:01Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:7a:dd 10.100.0.12
Nov 29 01:51:01 np0005539505 nova_compute[186958]: 2025-11-29 06:51:01.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:02 np0005539505 nova_compute[186958]: 2025-11-29 06:51:02.721 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:03 np0005539505 nova_compute[186958]: 2025-11-29 06:51:03.384 186962 DEBUG nova.network.neutron [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updated VIF entry in instance network info cache for port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:51:03 np0005539505 nova_compute[186958]: 2025-11-29 06:51:03.385 186962 DEBUG nova.network.neutron [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updating instance_info_cache with network_info: [{"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:03 np0005539505 nova_compute[186958]: 2025-11-29 06:51:03.426 186962 DEBUG oslo_concurrency.lockutils [req-13b5658d-8c6c-48ae-91e0-41dc1e5e8040 req-55a03ff5-ffba-42a2-8281-27d00cd14302 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1d7d2595-f3da-459e-a875-4b7ef22931bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:05 np0005539505 podman[215973]: 2025-11-29 06:51:05.726309252 +0000 UTC m=+0.062676753 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:06 np0005539505 nova_compute[186958]: 2025-11-29 06:51:06.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:07 np0005539505 nova_compute[186958]: 2025-11-29 06:51:07.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.226 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.227 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.248 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.363 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.364 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.373 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.373 186962 INFO nova.compute.claims [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.519 186962 DEBUG nova.compute.provider_tree [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.534 186962 DEBUG nova.scheduler.client.report [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.564 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.566 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.625 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.625 186962 DEBUG nova.network.neutron [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.646 186962 INFO nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.669 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.802 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.804 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.804 186962 INFO nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Creating image(s)#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.805 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.805 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.806 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.818 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.878 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.879 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.880 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.895 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.948 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.949 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.996 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.998 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:10 np0005539505 nova_compute[186958]: 2025-11-29 06:51:10.998 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.024 186962 DEBUG nova.network.neutron [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.026 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.079 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.080 186962 DEBUG nova.virt.disk.api [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Checking if we can resize image /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.080 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.135 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.136 186962 DEBUG nova.virt.disk.api [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Cannot resize image /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.136 186962 DEBUG nova.objects.instance [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lazy-loading 'migration_context' on Instance uuid b8acedf9-c61c-4006-b7b7-0e6b2ede690b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.151 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.152 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Ensure instance console log exists: /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.153 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.154 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.155 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.158 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.165 186962 WARNING nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.169 186962 DEBUG nova.virt.libvirt.host [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.170 186962 DEBUG nova.virt.libvirt.host [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.173 186962 DEBUG nova.virt.libvirt.host [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.174 186962 DEBUG nova.virt.libvirt.host [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.175 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.175 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.176 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.176 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.177 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.177 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.177 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.178 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.178 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.178 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.179 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.179 186962 DEBUG nova.virt.hardware [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
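[editor's note] The hardware.py trace above ("Build topologies for 1 vcpu(s) 1:1:1 … Got 1 possible topologies") is a search over sockets/cores/threads factorizations of the vCPU count within the 65536 limits; with 1 vCPU the only factorization is 1:1:1. A small illustrative enumeration of that search (function name is mine, not Nova's):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Every (sockets, cores, threads) triple whose product is exactly vcpus.

    The default limits match the "limits were sockets=65536, cores=65536,
    threads=65536" line in the log.
    """
    topologies = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topologies.append((s, c, t))
    return topologies
```

For the 1-vCPU m1.nano flavor this yields the single `(1, 1, 1)` topology the log reports; a 4-vCPU flavor would yield six candidates for the preference sort to rank.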
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.184 186962 DEBUG nova.objects.instance [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lazy-loading 'pci_devices' on Instance uuid b8acedf9-c61c-4006-b7b7-0e6b2ede690b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.196 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <uuid>b8acedf9-c61c-4006-b7b7-0e6b2ede690b</uuid>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <name>instance-00000012</name>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiagnosticsTest-server-722668643</nova:name>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:51:11</nova:creationTime>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:user uuid="1421f8a87cd7495f9ca2ecb96b70925c">tempest-ServerDiagnosticsTest-1134508533-project-member</nova:user>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:        <nova:project uuid="3cd11e636a954ce4b808b2a5ebd76b12">tempest-ServerDiagnosticsTest-1134508533</nova:project>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="serial">b8acedf9-c61c-4006-b7b7-0e6b2ede690b</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="uuid">b8acedf9-c61c-4006-b7b7-0e6b2ede690b</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.config"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/console.log" append="off"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:51:11 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:51:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:51:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:51:11 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
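[editor's note] The generated domain XML above can be checked programmatically against the flavor; note libvirt's `<memory>` is in KiB, so `131072` is the flavor's 128 MiB. A stdlib sketch over a trimmed copy of the logged document:

```python
import xml.etree.ElementTree as ET

# Trimmed from the _get_guest_xml output in the log; metadata, controllers,
# and most devices are omitted for brevity.
DOMAIN_XML = """<domain type="kvm">
 <uuid>b8acedf9-c61c-4006-b7b7-0e6b2ede690b</uuid>
 <name>instance-00000012</name>
 <memory>131072</memory>
 <vcpu>1</vcpu>
 <devices>
   <disk type="file" device="disk">
     <target dev="vda" bus="virtio"/>
   </disk>
   <disk type="file" device="cdrom">
     <target dev="sda" bus="sata"/>
   </disk>
 </devices>
</domain>"""

root = ET.fromstring(DOMAIN_XML)
memory_mib = int(root.findtext("memory")) // 1024   # libvirt <memory> is KiB
disks = [d.find("target").get("dev") for d in root.iter("disk")]
```

Here `memory_mib` comes out to 128 and `disks` to the `vda` root disk plus the `sda` config-drive cdrom, matching the block_device_info logged earlier.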
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.266 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.266 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.267 186962 INFO nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Using config drive#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.441 186962 INFO nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Creating config drive at /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.config#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.445 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_91kczs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.578 186962 DEBUG oslo_concurrency.processutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3_91kczs" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
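[editor's note] The mkisofs lines above pack a temporary directory (`/tmp/tmp3_91kczs`) into an ISO9660 image labeled `config-2` — the config drive the guest's cloud-init reads. A sketch of staging that tree, assuming only the standard `openstack/latest/meta_data.json` layout; the metadata fields and `build_configdrive_tree` name are illustrative:

```python
import json
import tempfile
from pathlib import Path


def build_configdrive_tree(instance_uuid, hostname):
    # Standard config-drive layout consumed by cloud-init when it finds a
    # filesystem labeled "config-2":
    #   <staging>/openstack/latest/meta_data.json
    staging = Path(tempfile.mkdtemp())
    latest = staging / "openstack" / "latest"
    latest.mkdir(parents=True)
    (latest / "meta_data.json").write_text(
        json.dumps({"uuid": instance_uuid, "hostname": hostname})
    )
    return staging


staging = build_configdrive_tree(
    "b8acedf9-c61c-4006-b7b7-0e6b2ede690b", "tempest-server")
# The log's next step packs this tree into the image, roughly:
#   mkisofs -o disk.config -ldots -allow-lowercase -allow-multidot -l \
#           -J -r -V config-2 <staging dir>
```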
Nov 29 01:51:11 np0005539505 systemd-machined[153285]: New machine qemu-8-instance-00000012.
Nov 29 01:51:11 np0005539505 systemd[1]: Started Virtual Machine qemu-8-instance-00000012.
Nov 29 01:51:11 np0005539505 nova_compute[186958]: 2025-11-29 06:51:11.919 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.489 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399072.4884133, b8acedf9-c61c-4006-b7b7-0e6b2ede690b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.489 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.493 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.494 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.497 186962 INFO nova.virt.libvirt.driver [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance spawned successfully.#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.498 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.530 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.537 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.542 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.543 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.543 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.544 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.544 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.545 186962 DEBUG nova.virt.libvirt.driver [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.571 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.571 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399072.4927182, b8acedf9-c61c-4006-b7b7-0e6b2ede690b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.571 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] VM Started (Lifecycle Event)#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.603 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.607 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.628 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.638 186962 INFO nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Took 1.83 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.638 186962 DEBUG nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.764 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.775 186962 INFO nova.compute.manager [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Took 2.46 seconds to build instance.#033[00m
Nov 29 01:51:12 np0005539505 nova_compute[186958]: 2025-11-29 06:51:12.802 186962 DEBUG oslo_concurrency.lockutils [None req-b2c7d17c-70a5-4352-b306-e7a26361c5e6 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.520 186962 DEBUG nova.compute.manager [None req-b3ab1f0f-0848-4e40-a170-8de8e73b0a90 6f5ca860b23b40e296a293eb4229f370 62f3ca944cd548a08f0405af9a8031b5 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.524 186962 INFO nova.compute.manager [None req-b3ab1f0f-0848-4e40-a170-8de8e73b0a90 6f5ca860b23b40e296a293eb4229f370 62f3ca944cd548a08f0405af9a8031b5 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Retrieving diagnostics#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.543 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.544 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.544 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.544 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.544 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.556 186962 INFO nova.compute.manager [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Terminating instance#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.567 186962 DEBUG nova.compute.manager [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:51:14 np0005539505 kernel: tap63ca2ebd-3e (unregistering): left promiscuous mode
Nov 29 01:51:14 np0005539505 NetworkManager[55134]: <info>  [1764399074.5881] device (tap63ca2ebd-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.598 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 ovn_controller[95143]: 2025-11-29T06:51:14Z|00085|binding|INFO|Releasing lport 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 from this chassis (sb_readonly=0)
Nov 29 01:51:14 np0005539505 ovn_controller[95143]: 2025-11-29T06:51:14Z|00086|binding|INFO|Setting lport 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 down in Southbound
Nov 29 01:51:14 np0005539505 ovn_controller[95143]: 2025-11-29T06:51:14Z|00087|binding|INFO|Removing iface tap63ca2ebd-3e ovn-installed in OVS
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.628 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:7a:dd 10.100.0.12'], port_security=['fa:16:3e:99:7a:dd 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1d7d2595-f3da-459e-a875-4b7ef22931bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61990940-649c-4332-bea9-4159087142dd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '99863a77c63a4673b0ef23a7c0ae373a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a14e098-439d-46d7-8c4c-c5f31ab6085d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.243'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e9d8ee4-839b-4078-a12e-f22cafda935b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.631 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 in datapath 61990940-649c-4332-bea9-4159087142dd unbound from our chassis#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.632 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61990940-649c-4332-bea9-4159087142dd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.634 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e73233d9-c6c2-41bb-82b5-c85857b3fb45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.636 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61990940-649c-4332-bea9-4159087142dd namespace which is not needed anymore#033[00m
Nov 29 01:51:14 np0005539505 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Nov 29 01:51:14 np0005539505 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000f.scope: Consumed 12.956s CPU time.
Nov 29 01:51:14 np0005539505 systemd-machined[153285]: Machine qemu-7-instance-0000000f terminated.
Nov 29 01:51:14 np0005539505 NetworkManager[55134]: <info>  [1764399074.7831] manager: (tap63ca2ebd-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 29 01:51:14 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [NOTICE]   (215878) : haproxy version is 2.8.14-c23fe91
Nov 29 01:51:14 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [NOTICE]   (215878) : path to executable is /usr/sbin/haproxy
Nov 29 01:51:14 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [ALERT]    (215878) : Current worker (215880) exited with code 143 (Terminated)
Nov 29 01:51:14 np0005539505 neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd[215872]: [WARNING]  (215878) : All workers exited. Exiting... (0)
Nov 29 01:51:14 np0005539505 systemd[1]: libpod-4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18.scope: Deactivated successfully.
Nov 29 01:51:14 np0005539505 conmon[215872]: conmon 4605a0c850dbc906081b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18.scope/container/memory.events
Nov 29 01:51:14 np0005539505 podman[216060]: 2025-11-29 06:51:14.794501379 +0000 UTC m=+0.050982034 container died 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 01:51:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18-userdata-shm.mount: Deactivated successfully.
Nov 29 01:51:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay-00219f4316926ace31cfbb9b6ceedd34b8c75c4088331caf6485e65237899871-merged.mount: Deactivated successfully.
Nov 29 01:51:14 np0005539505 podman[216060]: 2025-11-29 06:51:14.83047132 +0000 UTC m=+0.086951955 container cleanup 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:51:14 np0005539505 systemd[1]: libpod-conmon-4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18.scope: Deactivated successfully.
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.844 186962 INFO nova.virt.libvirt.driver [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Instance destroyed successfully.#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.845 186962 DEBUG nova.objects.instance [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lazy-loading 'resources' on Instance uuid 1d7d2595-f3da-459e-a875-4b7ef22931bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.865 186962 DEBUG nova.virt.libvirt.vif [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:50:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',display_name='tempest-ServersWithSpecificFlavorTestJSON-server-1781641709',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(20),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverswithspecificflavortestjson-server-1781641709',id=15,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=20,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM1xgBYSUC8gvnJcy96ZHeGXy5fZIEzREwu2SdQF2/u48S662pU9yPXCvPyqRaFOz6WipFAjO8OpVLmGBh29YapeoV6qQTkRjF0wg+tU+pxk7BBzNSGVITtKYZ1VXuTWyg==',key_name='tempest-keypair-1912782802',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:50:49Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='99863a77c63a4673b0ef23a7c0ae373a',ramdisk_id='',reservation_id='r-lhsz7lpk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersWithSpecificFlavorTestJSON-706098489',owner_user_name='tempest-ServersWithSpecificFlavorTestJSON-706098489-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:50:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0908eb33a338434891ed9f5dd3768bab',uuid=1d7d2595-f3da-459e-a875-4b7ef22931bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.865 186962 DEBUG nova.network.os_vif_util [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converting VIF {"id": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "address": "fa:16:3e:99:7a:dd", "network": {"id": "61990940-649c-4332-bea9-4159087142dd", "bridge": "br-int", "label": "tempest-ServersWithSpecificFlavorTestJSON-1615815347-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.243", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "99863a77c63a4673b0ef23a7c0ae373a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63ca2ebd-3e", "ovs_interfaceid": "63ca2ebd-3ecf-41f1-aaa0-5aca12224c62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.866 186962 DEBUG nova.network.os_vif_util [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.866 186962 DEBUG os_vif [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.868 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.868 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63ca2ebd-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.872 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.874 186962 INFO os_vif [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:7a:dd,bridge_name='br-int',has_traffic_filtering=True,id=63ca2ebd-3ecf-41f1-aaa0-5aca12224c62,network=Network(61990940-649c-4332-bea9-4159087142dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63ca2ebd-3e')#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.875 186962 INFO nova.virt.libvirt.driver [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Deleting instance files /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc_del#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.875 186962 INFO nova.virt.libvirt.driver [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Deletion of /var/lib/nova/instances/1d7d2595-f3da-459e-a875-4b7ef22931bc_del complete#033[00m
Nov 29 01:51:14 np0005539505 podman[216107]: 2025-11-29 06:51:14.884793667 +0000 UTC m=+0.034904812 container remove 4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.888 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccbc937-22b2-479c-a196-ebf6beec58c3]: (4, ('Sat Nov 29 06:51:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd (4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18)\n4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18\nSat Nov 29 06:51:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-61990940-649c-4332-bea9-4159087142dd (4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18)\n4605a0c850dbc906081baae184fb7e2fc08a9a6b4eb2966638cbbe77a84caf18\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.890 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[979bb629-3212-4939-8538-e0ea33582d50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.892 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61990940-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 kernel: tap61990940-60: left promiscuous mode
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.900 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.901 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.901 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.901 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.901 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.908 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.910 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6722f5d7-cd86-49fb-981e-ee16c7ad2d17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.914 186962 INFO nova.compute.manager [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Terminating instance#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.924 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "refresh_cache-b8acedf9-c61c-4006-b7b7-0e6b2ede690b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.925 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquired lock "refresh_cache-b8acedf9-c61c-4006-b7b7-0e6b2ede690b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.925 186962 DEBUG nova.network.neutron [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.933 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d02af80f-6f15-44e9-9cea-e5cc037ccf77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.934 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc8a94a-673f-4179-b83e-98e126fe3abe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.947 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[076b4490-e999-4453-845b-2c74d7efd12e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449622, 'reachable_time': 26769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216121, 'error': None, 'target': 'ovnmeta-61990940-649c-4332-bea9-4159087142dd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 systemd[1]: run-netns-ovnmeta\x2d61990940\x2d649c\x2d4332\x2dbea9\x2d4159087142dd.mount: Deactivated successfully.
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.950 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61990940-649c-4332-bea9-4159087142dd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:51:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:14.951 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8a8da526-cb01-4e27-a7c8-0e796fa5048a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.995 186962 INFO nova.compute.manager [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.996 186962 DEBUG oslo.service.loopingcall [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.996 186962 DEBUG nova.compute.manager [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:51:14 np0005539505 nova_compute[186958]: 2025-11-29 06:51:14.996 186962 DEBUG nova.network.neutron [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.274 186962 DEBUG nova.network.neutron [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.650 186962 DEBUG nova.compute.manager [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-unplugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.650 186962 DEBUG oslo_concurrency.lockutils [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.651 186962 DEBUG oslo_concurrency.lockutils [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.651 186962 DEBUG oslo_concurrency.lockutils [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.652 186962 DEBUG nova.compute.manager [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] No waiting events found dispatching network-vif-unplugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:51:15 np0005539505 nova_compute[186958]: 2025-11-29 06:51:15.652 186962 DEBUG nova.compute.manager [req-a2da92b7-7b13-4317-9d74-083b5a5085b8 req-84b2c367-686e-4a82-932d-48a22762e947 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-unplugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.115 186962 DEBUG nova.network.neutron [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.136 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Releasing lock "refresh_cache-b8acedf9-c61c-4006-b7b7-0e6b2ede690b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.137 186962 DEBUG nova.compute.manager [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:16.158 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:51:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:16.160 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:51:16 np0005539505 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Deactivated successfully.
Nov 29 01:51:16 np0005539505 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000012.scope: Consumed 4.479s CPU time.
Nov 29 01:51:16 np0005539505 systemd-machined[153285]: Machine qemu-8-instance-00000012 terminated.
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.213 186962 DEBUG nova.network.neutron [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.236 186962 INFO nova.compute.manager [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Took 1.24 seconds to deallocate network for instance.#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.309 186962 DEBUG nova.compute.manager [req-25823647-3014-44e6-b412-e48b3bb18911 req-9775a4f3-62f5-4b1f-a668-0279fcfe4dc6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-deleted-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.323 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.323 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.384 186962 INFO nova.virt.libvirt.driver [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance destroyed successfully.#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.384 186962 DEBUG nova.objects.instance [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lazy-loading 'resources' on Instance uuid b8acedf9-c61c-4006-b7b7-0e6b2ede690b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.397 186962 INFO nova.virt.libvirt.driver [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Deleting instance files /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b_del#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.397 186962 INFO nova.virt.libvirt.driver [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Deletion of /var/lib/nova/instances/b8acedf9-c61c-4006-b7b7-0e6b2ede690b_del complete#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.401 186962 DEBUG nova.compute.provider_tree [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.421 186962 DEBUG nova.scheduler.client.report [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.449 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.469 186962 INFO nova.compute.manager [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.469 186962 DEBUG oslo.service.loopingcall [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.470 186962 DEBUG nova.compute.manager [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.470 186962 DEBUG nova.network.neutron [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.473 186962 INFO nova.scheduler.client.report [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Deleted allocations for instance 1d7d2595-f3da-459e-a875-4b7ef22931bc#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.620 186962 DEBUG oslo_concurrency.lockutils [None req-7c00e0c1-87fa-4cfd-8280-b99bf47e7d85 0908eb33a338434891ed9f5dd3768bab 99863a77c63a4673b0ef23a7c0ae373a - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:16 np0005539505 nova_compute[186958]: 2025-11-29 06:51:16.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.117 186962 DEBUG nova.network.neutron [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.157 186962 DEBUG nova.network.neutron [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.178 186962 INFO nova.compute.manager [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Took 0.71 seconds to deallocate network for instance.#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.272 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.273 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.323 186962 DEBUG nova.compute.provider_tree [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.342 186962 DEBUG nova.scheduler.client.report [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.371 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.410 186962 INFO nova.scheduler.client.report [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Deleted allocations for instance b8acedf9-c61c-4006-b7b7-0e6b2ede690b#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.502 186962 DEBUG oslo_concurrency.lockutils [None req-42595df1-3936-41fb-b100-de921d44182b 1421f8a87cd7495f9ca2ecb96b70925c 3cd11e636a954ce4b808b2a5ebd76b12 - - default default] Lock "b8acedf9-c61c-4006-b7b7-0e6b2ede690b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:17 np0005539505 podman[216132]: 2025-11-29 06:51:17.736568937 +0000 UTC m=+0.063136855 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:51:17 np0005539505 podman[216131]: 2025-11-29 06:51:17.74804031 +0000 UTC m=+0.075608997 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.895 186962 DEBUG nova.compute.manager [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.896 186962 DEBUG oslo_concurrency.lockutils [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.896 186962 DEBUG oslo_concurrency.lockutils [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.896 186962 DEBUG oslo_concurrency.lockutils [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1d7d2595-f3da-459e-a875-4b7ef22931bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.896 186962 DEBUG nova.compute.manager [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] No waiting events found dispatching network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:51:17 np0005539505 nova_compute[186958]: 2025-11-29 06:51:17.897 186962 WARNING nova.compute.manager [req-23af980d-c374-44cb-9bb2-e5a770ff9857 req-bee1f335-2985-405d-b7c7-89116a8faa8b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Received unexpected event network-vif-plugged-63ca2ebd-3ecf-41f1-aaa0-5aca12224c62 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:51:19 np0005539505 nova_compute[186958]: 2025-11-29 06:51:19.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:20 np0005539505 podman[216177]: 2025-11-29 06:51:20.761273418 +0000 UTC m=+0.082592643 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 01:51:22 np0005539505 nova_compute[186958]: 2025-11-29 06:51:22.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:24.162 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:51:24 np0005539505 nova_compute[186958]: 2025-11-29 06:51:24.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:26 np0005539505 nova_compute[186958]: 2025-11-29 06:51:26.696 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:26 np0005539505 podman[216196]: 2025-11-29 06:51:26.743144423 +0000 UTC m=+0.074750512 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:51:26 np0005539505 nova_compute[186958]: 2025-11-29 06:51:26.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:26 np0005539505 podman[216197]: 2025-11-29 06:51:26.803010186 +0000 UTC m=+0.129365597 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:51:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:26.927 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:26.927 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:51:26.928 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:27 np0005539505 nova_compute[186958]: 2025-11-29 06:51:27.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:29 np0005539505 nova_compute[186958]: 2025-11-29 06:51:29.843 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399074.842578, 1d7d2595-f3da-459e-a875-4b7ef22931bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:29 np0005539505 nova_compute[186958]: 2025-11-29 06:51:29.844 186962 INFO nova.compute.manager [-] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:51:29 np0005539505 nova_compute[186958]: 2025-11-29 06:51:29.880 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:29 np0005539505 nova_compute[186958]: 2025-11-29 06:51:29.893 186962 DEBUG nova.compute.manager [None req-0f8a4b14-1a5e-4790-9e6b-ea30ee656a07 - - - - - -] [instance: 1d7d2595-f3da-459e-a875-4b7ef22931bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:31 np0005539505 nova_compute[186958]: 2025-11-29 06:51:31.382 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399076.3813055, b8acedf9-c61c-4006-b7b7-0e6b2ede690b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:31 np0005539505 nova_compute[186958]: 2025-11-29 06:51:31.382 186962 INFO nova.compute.manager [-] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:51:31 np0005539505 nova_compute[186958]: 2025-11-29 06:51:31.459 186962 DEBUG nova.compute.manager [None req-022b04d4-d503-4b42-b4d7-13c1e33b6f00 - - - - - -] [instance: b8acedf9-c61c-4006-b7b7-0e6b2ede690b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:31 np0005539505 podman[216244]: 2025-11-29 06:51:31.731955524 +0000 UTC m=+0.063472885 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:51:32 np0005539505 nova_compute[186958]: 2025-11-29 06:51:32.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:34 np0005539505 nova_compute[186958]: 2025-11-29 06:51:34.918 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.109 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "c9068b06-c8a5-41cb-b45f-5737247de868" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.110 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.135 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.524 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.525 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.531 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.531 186962 INFO nova.compute.claims [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.811 186962 DEBUG nova.compute.provider_tree [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.856 186962 DEBUG nova.scheduler.client.report [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.884 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:35 np0005539505 nova_compute[186958]: 2025-11-29 06:51:35.885 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.059 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.059 186962 DEBUG nova.network.neutron [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.097 186962 INFO nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.118 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.358 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.360 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.360 186962 INFO nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Creating image(s)#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.361 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.362 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.363 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.380 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.425 186962 DEBUG nova.network.neutron [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.426 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.457 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.458 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.458 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.470 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.522 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.523 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:36 np0005539505 podman[216273]: 2025-11-29 06:51:36.73303294 +0000 UTC m=+0.063877086 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.879 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk 1073741824" returned: 0 in 0.356s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.881 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.423s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.882 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.979 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.981 186962 DEBUG nova.virt.disk.api [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Checking if we can resize image /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:51:36 np0005539505 nova_compute[186958]: 2025-11-29 06:51:36.982 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.063 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.064 186962 DEBUG nova.virt.disk.api [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Cannot resize image /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.064 186962 DEBUG nova.objects.instance [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lazy-loading 'migration_context' on Instance uuid c9068b06-c8a5-41cb-b45f-5737247de868 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.138 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.139 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Ensure instance console log exists: /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.140 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.140 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.141 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.144 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.154 186962 WARNING nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.160 186962 DEBUG nova.virt.libvirt.host [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.161 186962 DEBUG nova.virt.libvirt.host [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.164 186962 DEBUG nova.virt.libvirt.host [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.165 186962 DEBUG nova.virt.libvirt.host [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.168 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.168 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.169 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.169 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.170 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.170 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.171 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.171 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.171 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.172 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.172 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.173 186962 DEBUG nova.virt.hardware [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.182 186962 DEBUG nova.objects.instance [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lazy-loading 'pci_devices' on Instance uuid c9068b06-c8a5-41cb-b45f-5737247de868 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.315 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <uuid>c9068b06-c8a5-41cb-b45f-5737247de868</uuid>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <name>instance-00000013</name>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerExternalEventsTest-server-1327000306</nova:name>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:51:37</nova:creationTime>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:user uuid="61146a9c683b48b79e5c0a1eef3c7f83">tempest-ServerExternalEventsTest-371877085-project-member</nova:user>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:        <nova:project uuid="bec415bf7d6d430699df7ffd6d79a79c">tempest-ServerExternalEventsTest-371877085</nova:project>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="serial">c9068b06-c8a5-41cb-b45f-5737247de868</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="uuid">c9068b06-c8a5-41cb-b45f-5737247de868</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.config"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/console.log" append="off"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:51:37 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:51:37 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:51:37 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:51:37 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.514 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.515 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.515 186962 INFO nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Using config drive#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.714 186962 INFO nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Creating config drive at /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.config#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.719 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjdwcx_n5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:51:37 np0005539505 nova_compute[186958]: 2025-11-29 06:51:37.847 186962 DEBUG oslo_concurrency.processutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjdwcx_n5" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:51:37 np0005539505 systemd-machined[153285]: New machine qemu-9-instance-00000013.
Nov 29 01:51:37 np0005539505 systemd[1]: Started Virtual Machine qemu-9-instance-00000013.
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.410 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399098.409239, c9068b06-c8a5-41cb-b45f-5737247de868 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.411 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.414 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.414 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.418 186962 INFO nova.virt.libvirt.driver [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance spawned successfully.#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.418 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.423 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.438 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.445 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.447 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.448 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.448 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.448 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.449 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.449 186962 DEBUG nova.virt.libvirt.driver [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.472 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.473 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399098.4096382, c9068b06-c8a5-41cb-b45f-5737247de868 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.473 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] VM Started (Lifecycle Event)#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.502 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.506 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.519 186962 INFO nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Took 2.16 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.520 186962 DEBUG nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.548 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.609 186962 INFO nova.compute.manager [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Took 3.19 seconds to build instance.#033[00m
Nov 29 01:51:38 np0005539505 nova_compute[186958]: 2025-11-29 06:51:38.626 186962 DEBUG oslo_concurrency.lockutils [None req-8fc6da75-49a7-4050-a8ba-caaf60ea3256 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.517s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.505 186962 DEBUG nova.compute.manager [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.505 186962 DEBUG nova.compute.manager [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.506 186962 DEBUG oslo_concurrency.lockutils [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] Acquiring lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.506 186962 DEBUG oslo_concurrency.lockutils [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] Acquired lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.506 186962 DEBUG nova.network.neutron [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.789 186962 DEBUG nova.network.neutron [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:39 np0005539505 nova_compute[186958]: 2025-11-29 06:51:39.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.011 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "c9068b06-c8a5-41cb-b45f-5737247de868" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.011 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.011 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "c9068b06-c8a5-41cb-b45f-5737247de868-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.011 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.012 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.023 186962 INFO nova.compute.manager [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Terminating instance#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.038 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.362 186962 DEBUG nova.network.neutron [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.393 186962 DEBUG oslo_concurrency.lockutils [None req-f7c806d0-45f2-4d73-96df-6aa3db2f0dc6 e33e7fa5128e4d7f896210984bad8227 bfdfd32b181f4d52b5f2001ba9915c67 - - default default] Releasing lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.393 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquired lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.393 186962 DEBUG nova.network.neutron [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:51:40 np0005539505 nova_compute[186958]: 2025-11-29 06:51:40.754 186962 DEBUG nova.network.neutron [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.270 186962 DEBUG nova.network.neutron [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.299 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Releasing lock "refresh_cache-c9068b06-c8a5-41cb-b45f-5737247de868" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.299 186962 DEBUG nova.compute.manager [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:51:41 np0005539505 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 29 01:51:41 np0005539505 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Consumed 3.350s CPU time.
Nov 29 01:51:41 np0005539505 systemd-machined[153285]: Machine qemu-9-instance-00000013 terminated.
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.410 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.410 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.543 186962 INFO nova.virt.libvirt.driver [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance destroyed successfully.#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.544 186962 DEBUG nova.objects.instance [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lazy-loading 'resources' on Instance uuid c9068b06-c8a5-41cb-b45f-5737247de868 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.583 186962 INFO nova.virt.libvirt.driver [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Deleting instance files /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868_del#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.584 186962 INFO nova.virt.libvirt.driver [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Deletion of /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868_del complete#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.617 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000013, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/c9068b06-c8a5-41cb-b45f-5737247de868/disk#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.670 186962 INFO nova.compute.manager [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.670 186962 DEBUG oslo.service.loopingcall [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.671 186962 DEBUG nova.compute.manager [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.671 186962 DEBUG nova.network.neutron [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.825 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.827 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5749MB free_disk=73.3363037109375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.827 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:41 np0005539505 nova_compute[186958]: 2025-11-29 06:51:41.828 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.082 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.255 186962 DEBUG nova.network.neutron [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.840 186962 DEBUG nova.network.neutron [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.866 186962 INFO nova.compute.manager [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Took 1.20 seconds to deallocate network for instance.#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.895 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance c9068b06-c8a5-41cb-b45f-5737247de868 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.896 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:51:42 np0005539505 nova_compute[186958]: 2025-11-29 06:51:42.896 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.087 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.091 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.103 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.123 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.123 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.123 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.124 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.124 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.183 186962 DEBUG nova.compute.provider_tree [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.203 186962 DEBUG nova.scheduler.client.report [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.227 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.298 186962 INFO nova.scheduler.client.report [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Deleted allocations for instance c9068b06-c8a5-41cb-b45f-5737247de868#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.406 186962 DEBUG oslo_concurrency.lockutils [None req-6afa34bd-008a-4432-a76c-4e69204820f8 61146a9c683b48b79e5c0a1eef3c7f83 bec415bf7d6d430699df7ffd6d79a79c - - default default] Lock "c9068b06-c8a5-41cb-b45f-5737247de868" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.408 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.408 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.409 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.409 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.432 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.433 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.433 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.433 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.433 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.434 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.434 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:51:43 np0005539505 nova_compute[186958]: 2025-11-29 06:51:43.452 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:51:44 np0005539505 nova_compute[186958]: 2025-11-29 06:51:44.925 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:47 np0005539505 nova_compute[186958]: 2025-11-29 06:51:47.116 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:48 np0005539505 nova_compute[186958]: 2025-11-29 06:51:48.397 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:51:48 np0005539505 podman[216338]: 2025-11-29 06:51:48.751178399 +0000 UTC m=+0.075214295 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:51:48 np0005539505 podman[216337]: 2025-11-29 06:51:48.772117998 +0000 UTC m=+0.096329069 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:51:49 np0005539505 nova_compute[186958]: 2025-11-29 06:51:49.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:51 np0005539505 podman[216381]: 2025-11-29 06:51:51.740202488 +0000 UTC m=+0.069927527 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:51:52 np0005539505 nova_compute[186958]: 2025-11-29 06:51:52.119 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:54 np0005539505 nova_compute[186958]: 2025-11-29 06:51:54.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:56 np0005539505 nova_compute[186958]: 2025-11-29 06:51:56.541 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399101.5394018, c9068b06-c8a5-41cb-b45f-5737247de868 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:51:56 np0005539505 nova_compute[186958]: 2025-11-29 06:51:56.541 186962 INFO nova.compute.manager [-] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:51:56 np0005539505 nova_compute[186958]: 2025-11-29 06:51:56.570 186962 DEBUG nova.compute.manager [None req-2231c6e3-5bdc-4e40-9900-65a99a44893b - - - - - -] [instance: c9068b06-c8a5-41cb-b45f-5737247de868] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:51:57 np0005539505 nova_compute[186958]: 2025-11-29 06:51:57.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:51:57 np0005539505 podman[216400]: 2025-11-29 06:51:57.777605774 +0000 UTC m=+0.105102455 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:51:57 np0005539505 podman[216401]: 2025-11-29 06:51:57.839588096 +0000 UTC m=+0.162553120 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 01:51:59 np0005539505 nova_compute[186958]: 2025-11-29 06:51:59.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:01 np0005539505 nova_compute[186958]: 2025-11-29 06:52:01.972 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.107 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.108 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.108 186962 INFO nova.compute.manager [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Unshelving#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.164 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.374 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.375 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.380 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.404 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.417 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.417 186962 INFO nova.compute.claims [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:52:02 np0005539505 podman[216447]: 2025-11-29 06:52:02.765416767 +0000 UTC m=+0.089040794 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.928 186962 DEBUG nova.compute.provider_tree [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.948 186962 DEBUG nova.scheduler.client.report [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:02 np0005539505 nova_compute[186958]: 2025-11-29 06:52:02.973 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:03 np0005539505 nova_compute[186958]: 2025-11-29 06:52:03.622 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:03 np0005539505 nova_compute[186958]: 2025-11-29 06:52:03.623 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:03 np0005539505 nova_compute[186958]: 2025-11-29 06:52:03.623 186962 DEBUG nova.network.neutron [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:52:03 np0005539505 nova_compute[186958]: 2025-11-29 06:52:03.888 186962 DEBUG nova.network.neutron [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:52:04 np0005539505 nova_compute[186958]: 2025-11-29 06:52:04.937 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.078 186962 DEBUG nova.network.neutron [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.105 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.106 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.106 186962 INFO nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating image(s)#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.107 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.107 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.108 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.108 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.121 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:05 np0005539505 nova_compute[186958]: 2025-11-29 06:52:05.122 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:07 np0005539505 nova_compute[186958]: 2025-11-29 06:52:07.558 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:07 np0005539505 podman[216469]: 2025-11-29 06:52:07.731155959 +0000 UTC m=+0.062073776 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.299 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.364 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.365 186962 DEBUG nova.virt.images [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] f1908021-8e03-4e68-be35-6176b9f2a833 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.366 186962 DEBUG nova.privsep.utils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.366 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.part /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.726 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.part /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.converted" returned: 0 in 0.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.737 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.794 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.795 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.674s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.813 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.866 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.868 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.868 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.883 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.946 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.947 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.980 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4,backing_fmt=raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.981 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "699710de794702bf7c50d3f51aa45a0dd64d5fc4" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:08 np0005539505 nova_compute[186958]: 2025-11-29 06:52:08.981 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.036 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.037 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.053 186962 INFO nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Rebasing disk image.#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.054 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.106 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.107 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:09 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:09Z|00088|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 01:52:09 np0005539505 nova_compute[186958]: 2025-11-29 06:52:09.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:10 np0005539505 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.611 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk" returned: 0 in 1.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.612 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.612 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Ensure instance console log exists: /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.613 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.613 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.613 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.615 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='7da1ee9b8aa53ae9329ca21f16408c6f',container_format='bare',created_at=2025-11-29T06:51:36Z,direct_url=<?>,disk_format='qcow2',id=f1908021-8e03-4e68-be35-6176b9f2a833,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-188422497-shelved',owner='2ad5553710d5496dafe785396586bef5',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-11-29T06:51:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.619 186962 WARNING nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.625 186962 DEBUG nova.virt.libvirt.host [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.626 186962 DEBUG nova.virt.libvirt.host [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.628 186962 DEBUG nova.virt.libvirt.host [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.628 186962 DEBUG nova.virt.libvirt.host [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.629 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.630 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='7da1ee9b8aa53ae9329ca21f16408c6f',container_format='bare',created_at=2025-11-29T06:51:36Z,direct_url=<?>,disk_format='qcow2',id=f1908021-8e03-4e68-be35-6176b9f2a833,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-188422497-shelved',owner='2ad5553710d5496dafe785396586bef5',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-11-29T06:51:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.630 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.630 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.631 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.631 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.631 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.631 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.632 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.632 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.632 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.632 186962 DEBUG nova.virt.hardware [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.633 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.651 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.664 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <uuid>0b850e95-2727-4d2f-afa1-7a755670a387</uuid>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <name>instance-00000010</name>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:name>tempest-UnshelveToHostMultiNodesTest-server-188422497</nova:name>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:52:10</nova:creationTime>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:user uuid="9399c90511c44462b8092380bad3cfdc">tempest-UnshelveToHostMultiNodesTest-1888846715-project-member</nova:user>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:        <nova:project uuid="2ad5553710d5496dafe785396586bef5">tempest-UnshelveToHostMultiNodesTest-1888846715</nova:project>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="f1908021-8e03-4e68-be35-6176b9f2a833"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="serial">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="uuid">0b850e95-2727-4d2f-afa1-7a755670a387</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/console.log" append="off"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:52:10 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:52:10 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:52:10 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:52:10 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.705 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.706 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.706 186962 INFO nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Using config drive#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.724 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.762 186962 DEBUG nova.objects.instance [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lazy-loading 'keypairs' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.936 186962 INFO nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Creating config drive at /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config#033[00m
Nov 29 01:52:10 np0005539505 nova_compute[186958]: 2025-11-29 06:52:10.941 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp31k45v33 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.063 186962 DEBUG oslo_concurrency.processutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp31k45v33" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:11 np0005539505 systemd-machined[153285]: New machine qemu-10-instance-00000010.
Nov 29 01:52:11 np0005539505 systemd[1]: Started Virtual Machine qemu-10-instance-00000010.
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.445 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399131.4452956, 0b850e95-2727-4d2f-afa1-7a755670a387 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.446 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.448 186962 DEBUG nova.compute.manager [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.449 186962 DEBUG nova.virt.libvirt.driver [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.452 186962 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance spawned successfully.#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.463 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.465 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.483 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.483 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399131.4462483, 0b850e95-2727-4d2f-afa1-7a755670a387 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.484 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Started (Lifecycle Event)#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.500 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.505 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:11 np0005539505 nova_compute[186958]: 2025-11-29 06:52:11.528 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:12 np0005539505 nova_compute[186958]: 2025-11-29 06:52:12.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:12 np0005539505 nova_compute[186958]: 2025-11-29 06:52:12.594 186962 DEBUG nova.compute.manager [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:12 np0005539505 nova_compute[186958]: 2025-11-29 06:52:12.694 186962 DEBUG oslo_concurrency.lockutils [None req-3d2eb811-0c71-4d5a-9658-dc5027d0b25b 72045f7461264a329f5e5033e85262d5 15ff417cc2324fa98b9837742841d9e5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.335 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.336 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.336 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "0b850e95-2727-4d2f-afa1-7a755670a387-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.337 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.337 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.353 186962 INFO nova.compute.manager [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Terminating instance#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.368 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.368 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquired lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.369 186962 DEBUG nova.network.neutron [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.720 186962 DEBUG nova.network.neutron [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:52:14 np0005539505 nova_compute[186958]: 2025-11-29 06:52:14.942 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.311 186962 DEBUG nova.network.neutron [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.344 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Releasing lock "refresh_cache-0b850e95-2727-4d2f-afa1-7a755670a387" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.347 186962 DEBUG nova.compute.manager [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:52:15 np0005539505 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 29 01:52:15 np0005539505 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000010.scope: Consumed 4.300s CPU time.
Nov 29 01:52:15 np0005539505 systemd-machined[153285]: Machine qemu-10-instance-00000010 terminated.
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.604 186962 INFO nova.virt.libvirt.driver [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance destroyed successfully.#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.604 186962 DEBUG nova.objects.instance [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lazy-loading 'resources' on Instance uuid 0b850e95-2727-4d2f-afa1-7a755670a387 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.617 186962 INFO nova.virt.libvirt.driver [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deleting instance files /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.622 186962 INFO nova.virt.libvirt.driver [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deletion of /var/lib/nova/instances/0b850e95-2727-4d2f-afa1-7a755670a387_del complete#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.686 186962 INFO nova.compute.manager [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.687 186962 DEBUG oslo.service.loopingcall [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.687 186962 DEBUG nova.compute.manager [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:52:15 np0005539505 nova_compute[186958]: 2025-11-29 06:52:15.687 186962 DEBUG nova.network.neutron [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.165 186962 DEBUG nova.network.neutron [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.252 186962 DEBUG nova.network.neutron [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.271 186962 INFO nova.compute.manager [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Took 0.58 seconds to deallocate network for instance.#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.413 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.414 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.484 186962 DEBUG nova.compute.provider_tree [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.520 186962 DEBUG nova.scheduler.client.report [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.559 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.589 186962 INFO nova.scheduler.client.report [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Deleted allocations for instance 0b850e95-2727-4d2f-afa1-7a755670a387#033[00m
Nov 29 01:52:16 np0005539505 nova_compute[186958]: 2025-11-29 06:52:16.683 186962 DEBUG oslo_concurrency.lockutils [None req-669b6a07-7a17-4c42-8872-7ed9b5f5d110 9399c90511c44462b8092380bad3cfdc 2ad5553710d5496dafe785396586bef5 - - default default] Lock "0b850e95-2727-4d2f-afa1-7a755670a387" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:17 np0005539505 nova_compute[186958]: 2025-11-29 06:52:17.304 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:17 np0005539505 nova_compute[186958]: 2025-11-29 06:52:17.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:17.483 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:52:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:17.484 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:52:19 np0005539505 podman[216559]: 2025-11-29 06:52:19.754202946 +0000 UTC m=+0.077386078 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:52:19 np0005539505 podman[216558]: 2025-11-29 06:52:19.762003215 +0000 UTC m=+0.085066743 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Nov 29 01:52:19 np0005539505 nova_compute[186958]: 2025-11-29 06:52:19.946 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:22 np0005539505 nova_compute[186958]: 2025-11-29 06:52:22.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:22 np0005539505 podman[216601]: 2025-11-29 06:52:22.770727127 +0000 UTC m=+0.101168435 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:52:24 np0005539505 nova_compute[186958]: 2025-11-29 06:52:24.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:25.487 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:26.928 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:26.929 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:26.929 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:27 np0005539505 nova_compute[186958]: 2025-11-29 06:52:27.414 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:28 np0005539505 podman[216620]: 2025-11-29 06:52:28.74155592 +0000 UTC m=+0.067007176 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:52:28 np0005539505 podman[216621]: 2025-11-29 06:52:28.778109348 +0000 UTC m=+0.100937360 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 01:52:29 np0005539505 nova_compute[186958]: 2025-11-29 06:52:29.956 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:30 np0005539505 nova_compute[186958]: 2025-11-29 06:52:30.603 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399135.6014903, 0b850e95-2727-4d2f-afa1-7a755670a387 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:30 np0005539505 nova_compute[186958]: 2025-11-29 06:52:30.603 186962 INFO nova.compute.manager [-] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:52:30 np0005539505 nova_compute[186958]: 2025-11-29 06:52:30.644 186962 DEBUG nova.compute.manager [None req-acbe4571-b5af-4056-84ea-e903ec703e10 - - - - - -] [instance: 0b850e95-2727-4d2f-afa1-7a755670a387] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:32 np0005539505 nova_compute[186958]: 2025-11-29 06:52:32.449 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:33 np0005539505 podman[216668]: 2025-11-29 06:52:33.765435718 +0000 UTC m=+0.102770620 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:52:34 np0005539505 nova_compute[186958]: 2025-11-29 06:52:34.971 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:37 np0005539505 nova_compute[186958]: 2025-11-29 06:52:37.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:38 np0005539505 podman[216690]: 2025-11-29 06:52:38.767576004 +0000 UTC m=+0.086454391 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:52:39 np0005539505 nova_compute[186958]: 2025-11-29 06:52:39.409 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.230 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.231 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.252 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.427 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.428 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.436 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.437 186962 INFO nova.compute.claims [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.620 186962 DEBUG nova.compute.provider_tree [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.638 186962 DEBUG nova.scheduler.client.report [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.664 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.665 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.748 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.749 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.772 186962 INFO nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.799 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.931 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.933 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.933 186962 INFO nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Creating image(s)#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.934 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.934 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.935 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.950 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:40 np0005539505 nova_compute[186958]: 2025-11-29 06:52:40.995 186962 DEBUG nova.policy [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.013 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.014 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.015 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.031 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.102 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.104 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.137 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.139 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.139 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.192 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.193 186962 DEBUG nova.virt.disk.api [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Checking if we can resize image /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.193 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.247 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.249 186962 DEBUG nova.virt.disk.api [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Cannot resize image /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.249 186962 DEBUG nova.objects.instance [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid a6a138ce-4707-4db6-892e-31809c4b4e03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.269 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.270 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Ensure instance console log exists: /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.270 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.270 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.271 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:41 np0005539505 nova_compute[186958]: 2025-11-29 06:52:41.760 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Successfully created port: 5afcc88b-5bd9-4937-8f10-36d3484dc52a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.402 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.402 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.581 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.683 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.684 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5789MB free_disk=73.26851272583008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.684 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.684 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.758 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance a6a138ce-4707-4db6-892e-31809c4b4e03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.758 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.758 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.805 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.820 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.839 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:52:42 np0005539505 nova_compute[186958]: 2025-11-29 06:52:42.839 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.478 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Successfully updated port: 5afcc88b-5bd9-4937-8f10-36d3484dc52a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.492 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.493 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.493 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.596 186962 DEBUG nova.compute.manager [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-changed-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.596 186962 DEBUG nova.compute.manager [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Refreshing instance network info cache due to event network-changed-5afcc88b-5bd9-4937-8f10-36d3484dc52a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.597 186962 DEBUG oslo_concurrency.lockutils [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.752 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.834 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.835 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:43 np0005539505 nova_compute[186958]: 2025-11-29 06:52:43.835 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.394 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.395 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.754 186962 DEBUG nova.network.neutron [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Updating instance_info_cache with network_info: [{"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.780 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.780 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Instance network_info: |[{"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.781 186962 DEBUG oslo_concurrency.lockutils [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.781 186962 DEBUG nova.network.neutron [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Refreshing network info cache for port 5afcc88b-5bd9-4937-8f10-36d3484dc52a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.785 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Start _get_guest_xml network_info=[{"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.789 186962 WARNING nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.793 186962 DEBUG nova.virt.libvirt.host [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.794 186962 DEBUG nova.virt.libvirt.host [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.796 186962 DEBUG nova.virt.libvirt.host [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.797 186962 DEBUG nova.virt.libvirt.host [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.798 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.799 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.800 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.800 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.800 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.800 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.801 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.801 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.801 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.802 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.802 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.802 186962 DEBUG nova.virt.hardware [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.807 186962 DEBUG nova.virt.libvirt.vif [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1550340611',display_name='tempest-ImagesTestJSON-server-1550340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1550340611',id=21,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-qv2n60l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:40Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=a6a138ce-4707-4db6-892e-31809c4b4e03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.807 186962 DEBUG nova.network.os_vif_util [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.808 186962 DEBUG nova.network.os_vif_util [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.809 186962 DEBUG nova.objects.instance [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid a6a138ce-4707-4db6-892e-31809c4b4e03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.829 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <uuid>a6a138ce-4707-4db6-892e-31809c4b4e03</uuid>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <name>instance-00000015</name>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:name>tempest-ImagesTestJSON-server-1550340611</nova:name>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:52:44</nova:creationTime>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        <nova:port uuid="5afcc88b-5bd9-4937-8f10-36d3484dc52a">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="serial">a6a138ce-4707-4db6-892e-31809c4b4e03</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="uuid">a6a138ce-4707-4db6-892e-31809c4b4e03</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.config"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:b9:e9:8d"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <target dev="tap5afcc88b-5b"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/console.log" append="off"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:52:44 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:52:44 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:52:44 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:52:44 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.830 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Preparing to wait for external event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.831 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.831 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.831 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.832 186962 DEBUG nova.virt.libvirt.vif [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1550340611',display_name='tempest-ImagesTestJSON-server-1550340611',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1550340611',id=21,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-qv2n60l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-mem
ber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:40Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=a6a138ce-4707-4db6-892e-31809c4b4e03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.833 186962 DEBUG nova.network.os_vif_util [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.834 186962 DEBUG nova.network.os_vif_util [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.834 186962 DEBUG os_vif [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.835 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.836 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.840 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afcc88b-5b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.841 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afcc88b-5b, col_values=(('external_ids', {'iface-id': '5afcc88b-5bd9-4937-8f10-36d3484dc52a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:e9:8d', 'vm-uuid': 'a6a138ce-4707-4db6-892e-31809c4b4e03'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:44 np0005539505 NetworkManager[55134]: <info>  [1764399164.8863] manager: (tap5afcc88b-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.888 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.891 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.892 186962 INFO os_vif [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b')#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.949 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.950 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.950 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:b9:e9:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:52:44 np0005539505 nova_compute[186958]: 2025-11-29 06:52:44.950 186962 INFO nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Using config drive#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.407 186962 INFO nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Creating config drive at /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.config#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.412 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmvp0bxsy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.538 186962 DEBUG oslo_concurrency.processutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmvp0bxsy" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:52:45 np0005539505 kernel: tap5afcc88b-5b: entered promiscuous mode
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.6097] manager: (tap5afcc88b-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 29 01:52:45 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:45Z|00089|binding|INFO|Claiming lport 5afcc88b-5bd9-4937-8f10-36d3484dc52a for this chassis.
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.609 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:45Z|00090|binding|INFO|5afcc88b-5bd9-4937-8f10-36d3484dc52a: Claiming fa:16:3e:b9:e9:8d 10.100.0.9
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.631 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:e9:8d 10.100.0.9'], port_security=['fa:16:3e:b9:e9:8d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5afcc88b-5bd9-4937-8f10-36d3484dc52a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.633 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5afcc88b-5bd9-4937-8f10-36d3484dc52a in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.634 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba#033[00m
Nov 29 01:52:45 np0005539505 systemd-udevd[216745]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:52:45 np0005539505 systemd-machined[153285]: New machine qemu-11-instance-00000015.
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0dce52-1658-41a8-871c-e84fe0140427]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.648 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.650 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.650 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c42dc373-4edd-47e0-881f-7c111ab52158]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.651 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce521217-1d14-41ca-878d-3fc8efe90c00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.6613] device (tap5afcc88b-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.6624] device (tap5afcc88b-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.669 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[5561e6b9-dc66-4395-a956-3418f5195127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.674 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:45Z|00091|binding|INFO|Setting lport 5afcc88b-5bd9-4937-8f10-36d3484dc52a ovn-installed in OVS
Nov 29 01:52:45 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:45Z|00092|binding|INFO|Setting lport 5afcc88b-5bd9-4937-8f10-36d3484dc52a up in Southbound
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 systemd[1]: Started Virtual Machine qemu-11-instance-00000015.
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.693 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ba8409-d1ae-4c84-88d0-7108c490c415]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.719 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ee49ba-d60c-4cbe-b5cb-712280d13ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.723 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[935f093b-48d8-43bb-9613-3a34422237d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.7247] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.755 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[82c02885-4086-4b5a-8164-a89a2e604646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.759 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[489375fb-0d64-4b3f-9ee0-453be63c9dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.7814] device (tap17ec2ca4-30): carrier: link connected
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.787 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[74b8c10c-5d6c-4b5e-b177-6ca023acfa07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.804 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d5417838-7092-4b2f-bcc5-a69dea5e000e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461336, 'reachable_time': 20333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216778, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.819 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f54f4e81-9667-47f3-82cd-47f2c9875d8f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461336, 'tstamp': 461336}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216779, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.837 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8c97b0a4-3c8a-40db-b9e2-ae03a09e73c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461336, 'reachable_time': 20333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216780, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[841a9666-c0ae-4a9f-ba46-3a287ef487bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.918 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fa755b-cf7d-41c9-8c55-158807bdf14c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.919 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.920 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.920 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:45 np0005539505 NetworkManager[55134]: <info>  [1764399165.9601] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 29 01:52:45 np0005539505 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.960 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.962 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.963 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:45 np0005539505 ovn_controller[95143]: 2025-11-29T06:52:45Z|00093|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.965 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.965 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[187a9b37-e501-48be-bb33-77d861bca7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.966 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:52:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:52:45.967 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:52:45 np0005539505 nova_compute[186958]: 2025-11-29 06:52:45.976 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.163 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399166.1625185, a6a138ce-4707-4db6-892e-31809c4b4e03 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.164 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] VM Started (Lifecycle Event)#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.183 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.192 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399166.1643307, a6a138ce-4707-4db6-892e-31809c4b4e03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.193 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.210 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.213 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.231 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:52:46 np0005539505 podman[216819]: 2025-11-29 06:52:46.345209473 +0000 UTC m=+0.055776919 container create 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 01:52:46 np0005539505 systemd[1]: Started libpod-conmon-8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145.scope.
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.398 186962 DEBUG nova.network.neutron [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Updated VIF entry in instance network info cache for port 5afcc88b-5bd9-4937-8f10-36d3484dc52a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.400 186962 DEBUG nova.network.neutron [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Updating instance_info_cache with network_info: [{"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:52:46 np0005539505 podman[216819]: 2025-11-29 06:52:46.311149366 +0000 UTC m=+0.021716812 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:52:46 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:52:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0237cd70c01ac3998c9c51d767b942fff4cded6a2c184717675564ed7490afb9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:52:46 np0005539505 nova_compute[186958]: 2025-11-29 06:52:46.420 186962 DEBUG oslo_concurrency.lockutils [req-1e9ac0ba-b78e-4a30-9af6-d3065b0557d5 req-564a15f4-d8ad-4357-87ea-496fb831d1be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a6a138ce-4707-4db6-892e-31809c4b4e03" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:52:46 np0005539505 podman[216819]: 2025-11-29 06:52:46.426700054 +0000 UTC m=+0.137267530 container init 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:52:46 np0005539505 podman[216819]: 2025-11-29 06:52:46.431679234 +0000 UTC m=+0.142246680 container start 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:52:46 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [NOTICE]   (216838) : New worker (216840) forked
Nov 29 01:52:46 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [NOTICE]   (216838) : Loading success.
Nov 29 01:52:47 np0005539505 nova_compute[186958]: 2025-11-29 06:52:47.582 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.081 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8eb302a794efaba1f3c4d181e523060affea2f808455707e52bbb7cd2e0b3f10" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.216 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 06:52:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a5d03ebb-1a8e-420f-8e24-3a60a6bf9bf6 x-openstack-request-id: req-a5d03ebb-1a8e-420f-8e24-3a60a6bf9bf6 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.217 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}, {"id": "e29df891-dca5-4a1c-9258-dc512a46956f", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e29df891-dca5-4a1c-9258-dc512a46956f"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.217 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a5d03ebb-1a8e-420f-8e24-3a60a6bf9bf6 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.218 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8eb302a794efaba1f3c4d181e523060affea2f808455707e52bbb7cd2e0b3f10" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.270 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 29 Nov 2025 06:52:48 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-43441b82-51c4-4599-b0cb-91e4913a2442 x-openstack-request-id: req-43441b82-51c4-4599-b0cb-91e4913a2442 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.270 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.270 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 used request id req-43441b82-51c4-4599-b0cb-91e4913a2442 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.271 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'name': 'tempest-ImagesTestJSON-server-1550340611', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000015', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'hostId': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.275 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a6a138ce-4707-4db6-892e-31809c4b4e03 / tap5afcc88b-5b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.275 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f0205de-00ff-4724-98f5-16efe583c641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.271831', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '042cea44-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '01123bdbf8b8eeeb6504758630dc1e15971eef478855a9f28aaf599faabd1b78'}]}, 'timestamp': '2025-11-29 06:52:48.275725', '_unique_id': '0b75fa4751ed44b6809db0320069105f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.276 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.277 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70183366-3ab9-4c06-b24b-6c6509342782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.277915', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '042d4e30-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '22c6e16f52847e62037d02c998d7ec03348c4e340e90d8748936c2928d3fe5d2'}]}, 'timestamp': '2025-11-29 06:52:48.278239', '_unique_id': '74ae60488378498aa1d2f418c3622449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.278 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.279 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '450742f4-3ca7-492f-ab71-78a212fcd731', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.279772', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '042d96b0-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '8abf6efa49c68d849f10a5a7842a4880bc7c3f31260304c0a8a518e962bc664c'}]}, 'timestamp': '2025-11-29 06:52:48.280068', '_unique_id': 'f45b468db86d42eab20dc73868427047'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.280 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.281 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b438e720-7c8a-4ed7-900b-cdbb43904f9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.281331', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '042dd242-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '7a624c20daa4caaa2906973ba82af5b07b00d0e10d84ef77ad90f8b4b04b92ba'}]}, 'timestamp': '2025-11-29 06:52:48.281565', '_unique_id': '685eedf60ebc45949db55dacab4cf4da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.282 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a2d2130-f634-4c8a-aa18-af5537845d24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.282656', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '042e05e6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': 'ed51831c18303f5886ab88382b00ad124fac652f748657d92875a04fe68da01b'}]}, 'timestamp': '2025-11-29 06:52:48.282885', '_unique_id': 'ecd1fb9827be4fb4bd0aaa6c691379b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.283 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.311 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.312 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd4c4ca7-db9a-4a23-8aa8-3557a9a5eddf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.283944', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0432732e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '3faf3c3767cfdb87aa2bfce628ef8d1f8af1df9fda25ea3284ecdb34c7ec4d3f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.283944', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04328148-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '1a08761fc3145eb675e347bc26e4f2295c423e9d7f8f3575080b924c35ed31f4'}]}, 'timestamp': '2025-11-29 06:52:48.312335', '_unique_id': '9b4342581f654924bf57c9a61b022748'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.313 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.314 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.314 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>]
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.315 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.315 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fbac3b4-b9f7-4ebe-bd87-58fcace46f61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.315156', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0432fe7a-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '49d611bc8edc096bd56beffe60d08a0cce80c46e6c30cbe4151b5dc4bcbe85cd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.315156', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04330a28-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': 'fc968ffed936171459de00578f3f1a52f5bb0d73b28475f62a96b70b161327fb'}]}, 'timestamp': '2025-11-29 06:52:48.315794', '_unique_id': '9e6159b6a85b471bad8bf2e41f3f61a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.317 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.317 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.317 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>]
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.317 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36cabbb4-098a-4d44-83fb-c286c655e196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.317859', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '04336734-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': 'fb9ea8e234bc4d5755442e3b8630de4b278660ebededd71de58ede38ba681560'}]}, 'timestamp': '2025-11-29 06:52:48.318191', '_unique_id': '62ce5fc782174e838beae18eeaeb9bfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.318 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.319 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.319 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ef9231-8c59-47c1-b8d6-f24c4c3d6c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.319648', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0433ac4e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '9279753539ad96d0b6ecafd255e1302603bf956de2db29f7d398ea9ba036f7ab'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.319648', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0433c2d8-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': 'f2085b754263d7d61d68abee80ac03530d31132f3a7e51a289ff771f915405be'}]}, 'timestamp': '2025-11-29 06:52:48.320599', '_unique_id': 'a3bd2396d1424c3981e88feaec0a67b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.321 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.333 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.333 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c828a678-5138-4d2e-ae83-36bbd6614f32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.322657', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0435c7b8-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': '18f4eb110e8895c0275147712e918f24457eaf64002ab2bc2d35c1f0f4034b0b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.322657', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0435d366-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': '5d1a35b32dba3ee17815840cfeb421c9d29ff69b0b8451cc41dc8afdcaf830d8'}]}, 'timestamp': '2025-11-29 06:52:48.334024', '_unique_id': '2959b0616d1d42c0ae35e073c31c3683'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.335 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0783cdc3-eb1f-43fc-b8eb-21be69af85c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.335760', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '04362140-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '54d20a38580b86abe467e4dde59a138e2b4d971045e0466e6299345bb1fb118a'}]}, 'timestamp': '2025-11-29 06:52:48.336017', '_unique_id': 'e93a2631613c4d03807d7a3dc51d68fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.337 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.355 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '494ecf1f-722c-4908-8f0d-8a0daca1c9c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'timestamp': '2025-11-29T06:52:48.337237', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '043938d0-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.996521973, 'message_signature': 'fa3c181fbae8876946c70a451e29ad8db065443ef4f987bfaaada6d4d3f3041b'}]}, 'timestamp': '2025-11-29 06:52:48.356380', '_unique_id': '218909aef05140cb9e2e9a64a48eee47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.357 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.358 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.358 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c548eb3-dd31-4e1f-af57-6d3df67ca8c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.358159', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '04398dc6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': 'dc71b6eeaacd68d0034df0d426f8652df90e2fe0819d9f184a856fdd1ae490d3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.358159', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '04399618-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': 'a139d401cb3764cb5c5a61f4a3a4f62867c19edf0b46bb675794291bcaf59666'}]}, 'timestamp': '2025-11-29 06:52:48.358651', '_unique_id': 'ee612c2c4976443b9477217afc7cd2a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.360 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.360 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.360 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69097f63-12a7-403e-a344-21f26560d16b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.360093', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0439d8da-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': 'ae4b151b4793e918a8f3bb8f85dd7e38e7ff4b4e1a0a6d5ea777a19be416872e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.360093', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0439e410-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.963521245, 'message_signature': '473405e72ec37cbc26fffe89884053b426f156f1e34f123e261f99e7a09b31c8'}]}, 'timestamp': '2025-11-29 06:52:48.360691', '_unique_id': '35a235d249c14f47bdf57a19ff9e38ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.361 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.362 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.362 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c54b32e-b8ad-4d8f-af96-36ad87229d76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.362279', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '043a2e0c-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '9d1f8e1cd0960912cd11fc4d1ef9b1058dc8e01fe44f40d5001c2d61ea22995d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.362279', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '043a3a6e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '0acd222537c3bed133c7350cf0caed111995ea85f249ce009418d717f532770e'}]}, 'timestamp': '2025-11-29 06:52:48.362899', '_unique_id': 'c3dd89ad416a4403aada399c974cd599'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.363 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.364 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7c4d837-e766-4d33-a36a-18249679264c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.364375', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '043a7fd8-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '0e21716a5bc448536bc67882ad6b487ef35f787c266a02f08d63f092e7660e85'}]}, 'timestamp': '2025-11-29 06:52:48.364723', '_unique_id': '705d660a8a3946d0b81903bf752fc26f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.365 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.366 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e67b6e75-21a7-4beb-9b42-2ef0dc5381ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.366354', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '043acccc-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '7ae9f0a64be9ff86cae6c1bfe9a7f3161485eaba315f81ab0bac684d626af191'}]}, 'timestamp': '2025-11-29 06:52:48.366666', '_unique_id': '01eeeec87e3e4f289ee9c333fd14ca30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.367 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.368 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.368 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance a6a138ce-4707-4db6-892e-31809c4b4e03: ceilometer.compute.pollsters.NoVolumeException
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.368 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfeb5f85-40cc-4382-9493-2220a16babe9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-00000015-a6a138ce-4707-4db6-892e-31809c4b4e03-tap5afcc88b-5b', 'timestamp': '2025-11-29T06:52:48.368415', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'tap5afcc88b-5b', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:e9:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap5afcc88b-5b'}, 'message_id': '043b1d62-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.912695516, 'message_signature': '2f2fa1459bb7e0bd6e6bf9527d964d2ae1f76b408fb72ba82ff3f68a1d38ddf3'}]}, 'timestamp': '2025-11-29 06:52:48.368762', '_unique_id': '0265f7305ba64859861fcbe723379adf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.369 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.370 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.370 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.370 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1803a6cf-878d-4cb9-8b09-0ebb2ce3d3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.370246', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '043b64f2-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': 'fa96d2bc1d45dde33be3f4e4eb5cd9dc847069f9881b224df2f6f2b2eedd0ec5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.370246', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '043b6fd8-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '383e0c37f78eee034cfe7a4ac8d39b89421fb6de69a7f6254a11ee88650b8a8c'}]}, 'timestamp': '2025-11-29 06:52:48.370819', '_unique_id': '384697da19ec4fcaa9a2fb70ed023a63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.371 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.372 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.372 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.372 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>]
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.372 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.372 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.373 12 DEBUG ceilometer.compute.pollsters [-] a6a138ce-4707-4db6-892e-31809c4b4e03/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25a4ee3b-2b5b-407f-b2d2-1d1c895e49cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-vda', 'timestamp': '2025-11-29T06:52:48.372795', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '043bc802-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '46a6a20c6d809f127d12d6d2b1c7914988bbb77139f6ec69813f6a7170c090d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03-sda', 'timestamp': '2025-11-29T06:52:48.372795', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-1550340611', 'name': 'instance-00000015', 'instance_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '043bd3a6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4615.924785136, 'message_signature': '872e989769ff9fd483991e1b136d95562018bfa631846d7acd982506d4046cd3'}]}, 'timestamp': '2025-11-29 06:52:48.373377', '_unique_id': '20a16a3265c54970b53829357d5bbbcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.374 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:52:48.375 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-1550340611>]
Nov 29 01:52:49 np0005539505 nova_compute[186958]: 2025-11-29 06:52:49.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:50 np0005539505 nova_compute[186958]: 2025-11-29 06:52:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:52:50 np0005539505 podman[216850]: 2025-11-29 06:52:50.764900438 +0000 UTC m=+0.081491152 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:52:50 np0005539505 podman[216849]: 2025-11-29 06:52:50.764875807 +0000 UTC m=+0.093071977 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, io.openshift.tags=minimal rhel9)
Nov 29 01:52:52 np0005539505 nova_compute[186958]: 2025-11-29 06:52:52.583 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:53 np0005539505 podman[216894]: 2025-11-29 06:52:53.718082029 +0000 UTC m=+0.051731276 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:52:54 np0005539505 nova_compute[186958]: 2025-11-29 06:52:54.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.380 186962 DEBUG nova.compute.manager [req-599bfb42-93ab-4332-9274-3a18d11776b9 req-3632948e-7e6a-4f4d-b59f-84cf8cb25dc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.380 186962 DEBUG oslo_concurrency.lockutils [req-599bfb42-93ab-4332-9274-3a18d11776b9 req-3632948e-7e6a-4f4d-b59f-84cf8cb25dc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.380 186962 DEBUG oslo_concurrency.lockutils [req-599bfb42-93ab-4332-9274-3a18d11776b9 req-3632948e-7e6a-4f4d-b59f-84cf8cb25dc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.380 186962 DEBUG oslo_concurrency.lockutils [req-599bfb42-93ab-4332-9274-3a18d11776b9 req-3632948e-7e6a-4f4d-b59f-84cf8cb25dc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.381 186962 DEBUG nova.compute.manager [req-599bfb42-93ab-4332-9274-3a18d11776b9 req-3632948e-7e6a-4f4d-b59f-84cf8cb25dc3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Processing event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.381 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.385 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399175.385376, a6a138ce-4707-4db6-892e-31809c4b4e03 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.385 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] VM Resumed (Lifecycle Event)
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.387 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.390 186962 INFO nova.virt.libvirt.driver [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Instance spawned successfully.
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.390 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.409 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.415 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.418 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.418 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.419 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.419 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.419 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.420 186962 DEBUG nova.virt.libvirt.driver [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.452 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.502 186962 INFO nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Took 14.57 seconds to spawn the instance on the hypervisor.
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.502 186962 DEBUG nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.610 186962 INFO nova.compute.manager [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Took 15.30 seconds to build instance.
Nov 29 01:52:55 np0005539505 nova_compute[186958]: 2025-11-29 06:52:55.625 186962 DEBUG oslo_concurrency.lockutils [None req-4f91c92c-2977-4a1c-a811-e88534da624c 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.110 186962 INFO nova.compute.manager [None req-cd60b1f3-3457-4d13-a62f-8cea0a0cb0a7 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Pausing
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.111 186962 DEBUG nova.objects.instance [None req-cd60b1f3-3457-4d13-a62f-8cea0a0cb0a7 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'flavor' on Instance uuid a6a138ce-4707-4db6-892e-31809c4b4e03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.166 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399177.166587, a6a138ce-4707-4db6-892e-31809c4b4e03 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.167 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] VM Paused (Lifecycle Event)
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.169 186962 DEBUG nova.compute.manager [None req-cd60b1f3-3457-4d13-a62f-8cea0a0cb0a7 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.212 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.218 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.476 186962 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.477 186962 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.478 186962 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.478 186962 DEBUG oslo_concurrency.lockutils [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.479 186962 DEBUG nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] No waiting events found dispatching network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.479 186962 WARNING nova.compute.manager [req-41bccbb2-8042-432d-b4a2-38f4ac624060 req-b20ea527-a32b-4913-81e6-e22b93296daf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received unexpected event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a for instance with vm_state paused and task_state None.
Nov 29 01:52:57 np0005539505 nova_compute[186958]: 2025-11-29 06:52:57.586 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:59 np0005539505 podman[216914]: 2025-11-29 06:52:59.75800462 +0000 UTC m=+0.078065220 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:52:59 np0005539505 podman[216915]: 2025-11-29 06:52:59.849043326 +0000 UTC m=+0.165768942 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:52:59 np0005539505 nova_compute[186958]: 2025-11-29 06:52:59.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:52:59 np0005539505 nova_compute[186958]: 2025-11-29 06:52:59.986 186962 DEBUG nova.compute.manager [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.092 186962 INFO nova.compute.manager [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] instance snapshotting
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.093 186962 WARNING nova.compute.manager [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] trying to snapshot a non-running instance: (state: 3 expected: 1)
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.405 186962 INFO nova.virt.libvirt.driver [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Beginning live snapshot process
Nov 29 01:53:00 np0005539505 virtqemud[186353]: invalid argument: disk vda does not have an active block job
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.611 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.675 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json -f qcow2" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.677 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.755 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03/disk --force-share --output=json -f qcow2" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.772 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.834 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.835 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.881 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336.delta 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.882 186962 INFO nova.virt.libvirt.driver [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Quiescing instance not available: QEMU guest agent is not enabled.
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.944 186962 DEBUG nova.virt.libvirt.guest [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.947 186962 INFO nova.virt.libvirt.driver [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Skipping quiescing instance: QEMU guest agent is not enabled.
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.992 186962 DEBUG nova.privsep.utils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 01:53:00 np0005539505 nova_compute[186958]: 2025-11-29 06:53:00.993 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336.delta /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:01 np0005539505 nova_compute[186958]: 2025-11-29 06:53:01.266 186962 DEBUG oslo_concurrency.processutils [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336.delta /var/lib/nova/instances/snapshots/tmpz67bymgx/5f38d83f759e4a5cb526238fee6a1336" returned: 0 in 0.273s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:01 np0005539505 nova_compute[186958]: 2025-11-29 06:53:01.268 186962 INFO nova.virt.libvirt.driver [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Snapshot extracted, beginning image upload
Nov 29 01:53:01 np0005539505 nova_compute[186958]: 2025-11-29 06:53:01.711 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating tmpfile /var/lib/nova/instances/tmpg68b0jet to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Nov 29 01:53:01 np0005539505 nova_compute[186958]: 2025-11-29 06:53:01.713 186962 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 01:53:02 np0005539505 nova_compute[186958]: 2025-11-29 06:53:02.588 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:02 np0005539505 nova_compute[186958]: 2025-11-29 06:53:02.879 186962 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 01:53:02 np0005539505 nova_compute[186958]: 2025-11-29 06:53:02.917 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:02 np0005539505 nova_compute[186958]: 2025-11-29 06:53:02.918 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:02 np0005539505 nova_compute[186958]: 2025-11-29 06:53:02.918 186962 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:03 np0005539505 nova_compute[186958]: 2025-11-29 06:53:03.654 186962 INFO nova.virt.libvirt.driver [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Snapshot image upload complete#033[00m
Nov 29 01:53:03 np0005539505 nova_compute[186958]: 2025-11-29 06:53:03.655 186962 INFO nova.compute.manager [None req-56102955-0d84-41ac-a665-23c821af028b 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Took 3.54 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.622 186962 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.651 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.666 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.667 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating instance directory: /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.667 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Creating disk.info with the contents: {'/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk': 'qcow2', '/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.668 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.668 186962 DEBUG nova.objects.instance [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.697 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:04 np0005539505 podman[216990]: 2025-11-29 06:53:04.758299603 +0000 UTC m=+0.080764137 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.768 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.769 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.770 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.794 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.861 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.863 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.974 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk 1073741824" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.975 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:04 np0005539505 nova_compute[186958]: 2025-11-29 06:53:04.976 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.034 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.036 186962 DEBUG nova.virt.disk.api [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Checking if we can resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.036 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.092 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.094 186962 DEBUG nova.virt.disk.api [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Cannot resize image /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.094 186962 DEBUG nova.objects.instance [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.108 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.130 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.132 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config to /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.133 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.591 186962 DEBUG oslo_concurrency.processutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk.config /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.592 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.594 186962 DEBUG nova.virt.libvirt.vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:52:57Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.594 186962 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.595 186962 DEBUG nova.network.os_vif_util [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.595 186962 DEBUG os_vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.596 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.597 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.597 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.601 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ff22547-58, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.602 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ff22547-58, col_values=(('external_ids', {'iface-id': '1ff22547-5892-4360-8abe-429ea2f212ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:a7:a1', 'vm-uuid': '2e380200-8276-4470-965f-31baa0bfd760'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539505 NetworkManager[55134]: <info>  [1764399185.6049] manager: (tap1ff22547-58): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.612 186962 INFO os_vif [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.613 186962 DEBUG nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.613 186962 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.769 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.770 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.771 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.771 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.772 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.904 186962 INFO nova.compute.manager [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Terminating instance#033[00m
Nov 29 01:53:05 np0005539505 nova_compute[186958]: 2025-11-29 06:53:05.944 186962 DEBUG nova.compute.manager [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:53:05 np0005539505 kernel: tap5afcc88b-5b (unregistering): left promiscuous mode
Nov 29 01:53:05 np0005539505 NetworkManager[55134]: <info>  [1764399185.9715] device (tap5afcc88b-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:53:06 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:06Z|00094|binding|INFO|Releasing lport 5afcc88b-5bd9-4937-8f10-36d3484dc52a from this chassis (sb_readonly=0)
Nov 29 01:53:06 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:06Z|00095|binding|INFO|Setting lport 5afcc88b-5bd9-4937-8f10-36d3484dc52a down in Southbound
Nov 29 01:53:06 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:06Z|00096|binding|INFO|Removing iface tap5afcc88b-5b ovn-installed in OVS
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.009 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.011 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.021 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Deactivated successfully.
Nov 29 01:53:06 np0005539505 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Consumed 2.390s CPU time.
Nov 29 01:53:06 np0005539505 systemd-machined[153285]: Machine qemu-11-instance-00000015 terminated.
Nov 29 01:53:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:06.165 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:e9:8d 10.100.0.9'], port_security=['fa:16:3e:b9:e9:8d 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a6a138ce-4707-4db6-892e-31809c4b4e03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5afcc88b-5bd9-4937-8f10-36d3484dc52a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:06.167 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5afcc88b-5bd9-4937-8f10-36d3484dc52a in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis#033[00m
Nov 29 01:53:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:06.170 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:53:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:06.171 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd855702-b2c4-4d4c-b097-372ed5a7da74]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:06.171 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.203 186962 INFO nova.virt.libvirt.driver [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Instance destroyed successfully.#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.204 186962 DEBUG nova.objects.instance [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid a6a138ce-4707-4db6-892e-31809c4b4e03 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.229 186962 DEBUG nova.virt.libvirt.vif [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:52:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1550340611',display_name='tempest-ImagesTestJSON-server-1550340611',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1550340611',id=21,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-qv2n60l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:03Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=a6a138ce-4707-4db6-892e-31809c4b4e03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.229 186962 DEBUG nova.network.os_vif_util [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "address": "fa:16:3e:b9:e9:8d", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afcc88b-5b", "ovs_interfaceid": "5afcc88b-5bd9-4937-8f10-36d3484dc52a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.230 186962 DEBUG nova.network.os_vif_util [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.231 186962 DEBUG os_vif [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.232 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.233 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5afcc88b-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.242 186962 INFO os_vif [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:e9:8d,bridge_name='br-int',has_traffic_filtering=True,id=5afcc88b-5bd9-4937-8f10-36d3484dc52a,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afcc88b-5b')#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.243 186962 INFO nova.virt.libvirt.driver [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Deleting instance files /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03_del#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.243 186962 INFO nova.virt.libvirt.driver [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Deletion of /var/lib/nova/instances/a6a138ce-4707-4db6-892e-31809c4b4e03_del complete#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.311 186962 INFO nova.compute.manager [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.311 186962 DEBUG oslo.service.loopingcall [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.312 186962 DEBUG nova.compute.manager [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.312 186962 DEBUG nova.network.neutron [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:53:06 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [NOTICE]   (216838) : haproxy version is 2.8.14-c23fe91
Nov 29 01:53:06 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [NOTICE]   (216838) : path to executable is /usr/sbin/haproxy
Nov 29 01:53:06 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [WARNING]  (216838) : Exiting Master process...
Nov 29 01:53:06 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [ALERT]    (216838) : Current worker (216840) exited with code 143 (Terminated)
Nov 29 01:53:06 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[216834]: [WARNING]  (216838) : All workers exited. Exiting... (0)
Nov 29 01:53:06 np0005539505 systemd[1]: libpod-8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145.scope: Deactivated successfully.
Nov 29 01:53:06 np0005539505 podman[217078]: 2025-11-29 06:53:06.802891068 +0000 UTC m=+0.536088498 container died 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.872 186962 DEBUG nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-unplugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.873 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.873 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.873 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.873 186962 DEBUG nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] No waiting events found dispatching network-vif-unplugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.874 186962 DEBUG nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-unplugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.874 186962 DEBUG nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.875 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.875 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.875 186962 DEBUG oslo_concurrency.lockutils [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.875 186962 DEBUG nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] No waiting events found dispatching network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:06 np0005539505 nova_compute[186958]: 2025-11-29 06:53:06.876 186962 WARNING nova.compute.manager [req-a0737d5a-91ad-4510-817d-68b1a03de928 req-52df15d5-c042-4435-b653-f33ecfaa150c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received unexpected event network-vif-plugged-5afcc88b-5bd9-4937-8f10-36d3484dc52a for instance with vm_state paused and task_state deleting.#033[00m
Nov 29 01:53:07 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145-userdata-shm.mount: Deactivated successfully.
Nov 29 01:53:07 np0005539505 systemd[1]: var-lib-containers-storage-overlay-0237cd70c01ac3998c9c51d767b942fff4cded6a2c184717675564ed7490afb9-merged.mount: Deactivated successfully.
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.468 186962 DEBUG nova.network.neutron [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.589 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.611 186962 DEBUG nova.compute.manager [req-ef8e72b3-fb86-4006-ae42-d3427810dea9 req-1a9ecafd-32e4-46e7-afed-b9950f2874c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Received event network-vif-deleted-5afcc88b-5bd9-4937-8f10-36d3484dc52a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.612 186962 INFO nova.compute.manager [req-ef8e72b3-fb86-4006-ae42-d3427810dea9 req-1a9ecafd-32e4-46e7-afed-b9950f2874c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Neutron deleted interface 5afcc88b-5bd9-4937-8f10-36d3484dc52a; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.612 186962 DEBUG nova.network.neutron [req-ef8e72b3-fb86-4006-ae42-d3427810dea9 req-1a9ecafd-32e4-46e7-afed-b9950f2874c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.622 186962 INFO nova.compute.manager [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Took 1.31 seconds to deallocate network for instance.#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.670 186962 DEBUG nova.compute.manager [req-ef8e72b3-fb86-4006-ae42-d3427810dea9 req-1a9ecafd-32e4-46e7-afed-b9950f2874c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Detach interface failed, port_id=5afcc88b-5bd9-4937-8f10-36d3484dc52a, reason: Instance a6a138ce-4707-4db6-892e-31809c4b4e03 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.768 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.769 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.840 186962 DEBUG nova.compute.provider_tree [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.855 186962 DEBUG nova.scheduler.client.report [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.892 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.895 186962 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Port 1ff22547-5892-4360-8abe-429ea2f212ee updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.929 186962 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg68b0jet',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 29 01:53:07 np0005539505 nova_compute[186958]: 2025-11-29 06:53:07.954 186962 INFO nova.scheduler.client.report [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance a6a138ce-4707-4db6-892e-31809c4b4e03#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.105 186962 DEBUG oslo_concurrency.lockutils [None req-461eb277-0256-49bd-bcdc-63203ea2abef 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "a6a138ce-4707-4db6-892e-31809c4b4e03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:08 np0005539505 systemd[1]: Starting libvirt proxy daemon...
Nov 29 01:53:08 np0005539505 systemd[1]: Started libvirt proxy daemon.
Nov 29 01:53:08 np0005539505 kernel: tap1ff22547-58: entered promiscuous mode
Nov 29 01:53:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:08Z|00097|binding|INFO|Claiming lport 1ff22547-5892-4360-8abe-429ea2f212ee for this additional chassis.
Nov 29 01:53:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:08Z|00098|binding|INFO|1ff22547-5892-4360-8abe-429ea2f212ee: Claiming fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:08 np0005539505 NetworkManager[55134]: <info>  [1764399188.3320] manager: (tap1ff22547-58): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Nov 29 01:53:08 np0005539505 systemd-udevd[217037]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.334 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:08 np0005539505 NetworkManager[55134]: <info>  [1764399188.3524] device (tap1ff22547-58): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:53:08 np0005539505 NetworkManager[55134]: <info>  [1764399188.3550] device (tap1ff22547-58): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:53:08 np0005539505 systemd-machined[153285]: New machine qemu-12-instance-00000017.
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:08 np0005539505 systemd[1]: Started Virtual Machine qemu-12-instance-00000017.
Nov 29 01:53:08 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:08Z|00099|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee ovn-installed in OVS
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.415 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:08 np0005539505 podman[217078]: 2025-11-29 06:53:08.425183991 +0000 UTC m=+2.158381331 container cleanup 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:53:08 np0005539505 systemd[1]: libpod-conmon-8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145.scope: Deactivated successfully.
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.796 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.799 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.816 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.943 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.944 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.952 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:53:08 np0005539505 nova_compute[186958]: 2025-11-29 06:53:08.953 186962 INFO nova.compute.claims [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.095 186962 DEBUG nova.compute.provider_tree [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.117 186962 DEBUG nova.scheduler.client.report [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.140 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.141 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.209 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.210 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.228 186962 INFO nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.247 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:53:09 np0005539505 podman[217145]: 2025-11-29 06:53:09.318078012 +0000 UTC m=+0.861886338 container remove 8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.324 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec2e732-44c6-4a5e-adc9-ca6757ca4887]: (4, ('Sat Nov 29 06:53:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145)\n8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145\nSat Nov 29 06:53:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145)\n8bc6ce1de3dc235d89f79caa349f4172685be5141fe74ff8e8b4d870c09a3145\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.327 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[00b3d85c-5a1b-4fd1-b9cf-64987096012f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.328 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:09 np0005539505 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.333 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.344 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0d048a-141b-43e0-8704-4d945089641a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[40d03c97-d6ce-4a0e-97e4-7e0ac202fd9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.362 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b7599c4f-21d9-4b67-938d-6bec308f4b34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.368 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.369 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.370 186962 INFO nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Creating image(s)#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.370 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.371 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.371 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:09 np0005539505 podman[217169]: 2025-11-29 06:53:09.376955981 +0000 UTC m=+0.228089808 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.379 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8026759-fd58-435d-aa47-7494aa95c15d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461329, 'reachable_time': 15547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217195, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.386 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:53:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:09.386 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[277d3c0a-3689-48d1-ad97-92ad640abc45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:09 np0005539505 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.398 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.434 186962 DEBUG nova.policy [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.461 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.462 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.462 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.473 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.527 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.529 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.590 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.591 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.592 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.664 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.665 186962 DEBUG nova.virt.disk.api [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Checking if we can resize image /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.666 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.724 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.726 186962 DEBUG nova.virt.disk.api [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Cannot resize image /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.726 186962 DEBUG nova.objects.instance [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.758 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.758 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Ensure instance console log exists: /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.759 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.760 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.760 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.784 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399189.7841983, 2e380200-8276-4470-965f-31baa0bfd760 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.785 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Started (Lifecycle Event)#033[00m
Nov 29 01:53:09 np0005539505 nova_compute[186958]: 2025-11-29 06:53:09.808 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:10 np0005539505 nova_compute[186958]: 2025-11-29 06:53:10.751 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399190.7513757, 2e380200-8276-4470-965f-31baa0bfd760 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:10 np0005539505 nova_compute[186958]: 2025-11-29 06:53:10.752 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:53:10 np0005539505 nova_compute[186958]: 2025-11-29 06:53:10.783 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:10 np0005539505 nova_compute[186958]: 2025-11-29 06:53:10.785 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:53:10 np0005539505 nova_compute[186958]: 2025-11-29 06:53:10.814 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 29 01:53:10 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:10Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:10 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:10Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:11 np0005539505 nova_compute[186958]: 2025-11-29 06:53:11.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:12 np0005539505 nova_compute[186958]: 2025-11-29 06:53:12.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539505 nova_compute[186958]: 2025-11-29 06:53:13.267 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Successfully created port: fd28d90e-6ad0-431b-9e31-3be4166a5614 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:53:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:13Z|00100|binding|INFO|Claiming lport 1ff22547-5892-4360-8abe-429ea2f212ee for this chassis.
Nov 29 01:53:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:13Z|00101|binding|INFO|1ff22547-5892-4360-8abe-429ea2f212ee: Claiming fa:16:3e:56:a7:a1 10.100.0.5
Nov 29 01:53:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:13Z|00102|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee up in Southbound
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.627 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '11', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.629 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c bound to our chassis#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.630 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.646 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86eee408-ff74-4941-88a5-224c85fc7f51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.647 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc691e2c0-b1 in ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.651 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc691e2c0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.651 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e51b2c9-97de-4ead-85ae-f3039e91ce8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.652 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[80869a3d-5591-457f-af0a-ce8744ebef58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.671 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[407f854a-ac6d-425c-a5d0-620fa0127062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.704 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5578efe3-7d85-44be-bd47-fdbd36952b43]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.747 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c20a95a3-74d2-4103-8a1b-daf231419355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.754 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8560147f-fc30-4974-a469-b7fe1e57bb72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 NetworkManager[55134]: <info>  [1764399193.7553] manager: (tapc691e2c0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Nov 29 01:53:13 np0005539505 systemd-udevd[217234]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.787 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f463fee3-ee7c-4caf-a557-c3764652e043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.790 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c3ae3519-069b-4346-a116-caa299a85faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 nova_compute[186958]: 2025-11-29 06:53:13.796 186962 INFO nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Post operation of migration started#033[00m
Nov 29 01:53:13 np0005539505 NetworkManager[55134]: <info>  [1764399193.8183] device (tapc691e2c0-b0): carrier: link connected
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.825 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[00efefa8-3450-4c70-b1f4-42df34c6b3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.843 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b1c9d11a-a4a4-4dff-b15d-e05245906273]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464140, 'reachable_time': 17959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217253, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.861 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[721b4a05-003e-432e-a03d-5e55a8e23b65]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:3d81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464140, 'tstamp': 464140}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217254, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.879 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa70ca4-50f7-46ee-836b-08985953e7e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464140, 'reachable_time': 17959, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217255, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.913 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86a762f1-ca5c-4865-8e23-c86be845dead]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.989 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b73c3ee0-0696-4c4c-99ef-ad3ba4568a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.991 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.991 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.991 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:13 np0005539505 NetworkManager[55134]: <info>  [1764399193.9943] manager: (tapc691e2c0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Nov 29 01:53:13 np0005539505 nova_compute[186958]: 2025-11-29 06:53:13.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539505 kernel: tapc691e2c0-b0: entered promiscuous mode
Nov 29 01:53:13 np0005539505 nova_compute[186958]: 2025-11-29 06:53:13.996 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:13.997 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:13 np0005539505 nova_compute[186958]: 2025-11-29 06:53:13.998 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:13Z|00103|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:53:14 np0005539505 nova_compute[186958]: 2025-11-29 06:53:14.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:14.015 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:14.016 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b55bbfe1-3ac3-402d-a1f9-56b3535f30c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:14.017 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:53:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:14.018 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'env', 'PROCESS_TAG=haproxy-c691e2c0-bf24-480c-9af6-236639f0492c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c691e2c0-bf24-480c-9af6-236639f0492c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:53:14 np0005539505 nova_compute[186958]: 2025-11-29 06:53:14.216 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:14 np0005539505 nova_compute[186958]: 2025-11-29 06:53:14.217 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:14 np0005539505 nova_compute[186958]: 2025-11-29 06:53:14.217 186962 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:14 np0005539505 podman[217288]: 2025-11-29 06:53:14.476858681 +0000 UTC m=+0.033903926 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:53:14 np0005539505 podman[217288]: 2025-11-29 06:53:14.878515889 +0000 UTC m=+0.435561114 container create 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:53:14 np0005539505 systemd[1]: Started libpod-conmon-15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245.scope.
Nov 29 01:53:14 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:53:14 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e59acc661d7e3f818d37eca485e1bb3a4c6f9f88eafd952fb94d5efd1733e56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:53:14 np0005539505 podman[217288]: 2025-11-29 06:53:14.941907545 +0000 UTC m=+0.498952790 container init 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:53:14 np0005539505 podman[217288]: 2025-11-29 06:53:14.949055117 +0000 UTC m=+0.506100342 container start 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:53:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [NOTICE]   (217308) : New worker (217310) forked
Nov 29 01:53:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [NOTICE]   (217308) : Loading success.
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.190 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Successfully updated port: fd28d90e-6ad0-431b-9e31-3be4166a5614 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.209 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.209 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.209 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.346 186962 DEBUG nova.compute.manager [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-changed-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.347 186962 DEBUG nova.compute.manager [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Refreshing instance network info cache due to event network-changed-fd28d90e-6ad0-431b-9e31-3be4166a5614. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:53:15 np0005539505 nova_compute[186958]: 2025-11-29 06:53:15.347 186962 DEBUG oslo_concurrency.lockutils [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:16 np0005539505 nova_compute[186958]: 2025-11-29 06:53:16.214 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:53:16 np0005539505 nova_compute[186958]: 2025-11-29 06:53:16.283 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.651 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.899 186962 DEBUG nova.network.neutron [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.922 186962 DEBUG nova.network.neutron [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updating instance_info_cache with network_info: [{"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.926 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.951 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.951 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance network_info: |[{"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.951 186962 DEBUG oslo_concurrency.lockutils [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.952 186962 DEBUG nova.network.neutron [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Refreshing network info cache for port fd28d90e-6ad0-431b-9e31-3be4166a5614 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.954 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Start _get_guest_xml network_info=[{"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.959 186962 WARNING nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.961 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.961 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.962 186962 DEBUG oslo_concurrency.lockutils [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.963 186962 DEBUG nova.virt.libvirt.host [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.964 186962 DEBUG nova.virt.libvirt.host [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.966 186962 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Nov 29 01:53:17 np0005539505 virtqemud[186353]: Domain id=12 name='instance-00000017' uuid=2e380200-8276-4470-965f-31baa0bfd760 is tainted: custom-monitor
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.973 186962 DEBUG nova.virt.libvirt.host [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.973 186962 DEBUG nova.virt.libvirt.host [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.974 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.975 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.975 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.975 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.975 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.976 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.976 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.976 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.976 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.977 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.977 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.977 186962 DEBUG nova.virt.hardware [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.980 186962 DEBUG nova.virt.libvirt.vif [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1708997664',display_name='tempest-ImagesTestJSON-server-1708997664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1708997664',id=25,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-g5l6lwqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:09Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=bf8f759b-677f-4b17-8d4d-2eee6b28a740,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.980 186962 DEBUG nova.network.os_vif_util [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.981 186962 DEBUG nova.network.os_vif_util [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.982 186962 DEBUG nova.objects.instance [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:17 np0005539505 nova_compute[186958]: 2025-11-29 06:53:17.999 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <uuid>bf8f759b-677f-4b17-8d4d-2eee6b28a740</uuid>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <name>instance-00000019</name>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:name>tempest-ImagesTestJSON-server-1708997664</nova:name>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:53:17</nova:creationTime>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        <nova:port uuid="fd28d90e-6ad0-431b-9e31-3be4166a5614">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="serial">bf8f759b-677f-4b17-8d4d-2eee6b28a740</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="uuid">bf8f759b-677f-4b17-8d4d-2eee6b28a740</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.config"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:1e:bb:f3"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <target dev="tapfd28d90e-6a"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/console.log" append="off"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:53:18 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:53:18 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:53:18 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:53:18 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.000 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Preparing to wait for external event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.001 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.001 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.001 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.002 186962 DEBUG nova.virt.libvirt.vif [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1708997664',display_name='tempest-ImagesTestJSON-server-1708997664',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1708997664',id=25,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-g5l6lwqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:09Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=bf8f759b-677f-4b17-8d4d-2eee6b28a740,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.002 186962 DEBUG nova.network.os_vif_util [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.002 186962 DEBUG nova.network.os_vif_util [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.003 186962 DEBUG os_vif [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.003 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.004 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.004 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.009 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.009 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd28d90e-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.009 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd28d90e-6a, col_values=(('external_ids', {'iface-id': 'fd28d90e-6ad0-431b-9e31-3be4166a5614', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:bb:f3', 'vm-uuid': 'bf8f759b-677f-4b17-8d4d-2eee6b28a740'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:18 np0005539505 NetworkManager[55134]: <info>  [1764399198.0122] manager: (tapfd28d90e-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.017 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.019 186962 INFO os_vif [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a')#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.553 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.554 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.554 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:1e:bb:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.554 186962 INFO nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Using config drive#033[00m
Nov 29 01:53:18 np0005539505 nova_compute[186958]: 2025-11-29 06:53:18.973 186962 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.346 186962 INFO nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Creating config drive at /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.config#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.351 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n3atfyn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.480 186962 DEBUG oslo_concurrency.processutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5n3atfyn" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.5361] manager: (tapfd28d90e-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Nov 29 01:53:19 np0005539505 kernel: tapfd28d90e-6a: entered promiscuous mode
Nov 29 01:53:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:19Z|00104|binding|INFO|Claiming lport fd28d90e-6ad0-431b-9e31-3be4166a5614 for this chassis.
Nov 29 01:53:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:19Z|00105|binding|INFO|fd28d90e-6ad0-431b-9e31-3be4166a5614: Claiming fa:16:3e:1e:bb:f3 10.100.0.12
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.539 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.550 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:bb:f3 10.100.0.12'], port_security=['fa:16:3e:1e:bb:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bf8f759b-677f-4b17-8d4d-2eee6b28a740', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=fd28d90e-6ad0-431b-9e31-3be4166a5614) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.551 104094 INFO neutron.agent.ovn.metadata.agent [-] Port fd28d90e-6ad0-431b-9e31-3be4166a5614 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.554 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba#033[00m
Nov 29 01:53:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:19Z|00106|binding|INFO|Setting lport fd28d90e-6ad0-431b-9e31-3be4166a5614 ovn-installed in OVS
Nov 29 01:53:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:19Z|00107|binding|INFO|Setting lport fd28d90e-6ad0-431b-9e31-3be4166a5614 up in Southbound
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.569 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 systemd-udevd[217338]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.570 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e18ed14-d8aa-40b8-9f10-814d45b7541d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.572 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.574 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.574 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9563d5-d6f6-4642-a899-d311bf31b7bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.577 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8ff22643-c044-45c6-81f0-231957e5cd90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.5929] device (tapfd28d90e-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.5937] device (tapfd28d90e-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.597 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[60a54e63-7ced-46ea-99ad-3ba35c802886]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 systemd-machined[153285]: New machine qemu-13-instance-00000019.
Nov 29 01:53:19 np0005539505 systemd[1]: Started Virtual Machine qemu-13-instance-00000019.
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.628 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed64aa2-0382-47cd-a8a2-e6b9a93621b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.677 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[138e235a-be84-46a0-9054-2da17a473ad1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.685 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5895d2-9bd8-40ba-b1bb-105ac9f18995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.6868] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Nov 29 01:53:19 np0005539505 systemd-udevd[217343]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.719 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a349ac-9ec7-492c-82d6-f0bc2413cce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.723 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[42e41cf9-13fc-42a5-8e33-b04592a6d85f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.7486] device (tap17ec2ca4-30): carrier: link connected
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.754 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e2d4be9f-c0e1-43d4-a0ad-4497437b3cbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.774 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5da7ab16-3cd0-47b0-bf16-1ba5135f524f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464733, 'reachable_time': 37917, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217373, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.790 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[857a2be7-6ebd-45bd-b53d-8246cb5fe827]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464733, 'tstamp': 464733}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217374, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.810 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3a4e3bfe-60d4-44ab-a4a0-fece69495ac8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464733, 'reachable_time': 37917, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217375, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.843 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7d35a07a-15dd-4051-856c-7aa2162d5d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.919 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d249a2bd-0c50-4d52-9651-5e7385786b27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.920 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.921 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.921 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:19 np0005539505 NetworkManager[55134]: <info>  [1764399199.9702] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.969 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.973 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:19Z|00108|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.974 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.980 186962 INFO nova.virt.libvirt.driver [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.985 186962 DEBUG nova.compute.manager [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:19 np0005539505 nova_compute[186958]: 2025-11-29 06:53:19.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.986 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.987 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca9d48e-5bea-4311-b9ad-41b4ecde2ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.987 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:53:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:19.988 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.049 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399200.0495746, bf8f759b-677f-4b17-8d4d-2eee6b28a740 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.050 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] VM Started (Lifecycle Event)#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.100 186962 DEBUG nova.objects.instance [None req-b6190cf2-3100-48db-adbc-c1efc49b2a77 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.196 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.202 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399200.049672, bf8f759b-677f-4b17-8d4d-2eee6b28a740 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.202 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.224 186962 DEBUG nova.network.neutron [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updated VIF entry in instance network info cache for port fd28d90e-6ad0-431b-9e31-3be4166a5614. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.224 186962 DEBUG nova.network.neutron [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updating instance_info_cache with network_info: [{"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.381 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.382 186962 DEBUG oslo_concurrency.lockutils [req-d9cbe0aa-0263-4233-8035-978238256c38 req-1a5c2748-e46d-485e-b887-5b9c11026953 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.386 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:53:20 np0005539505 podman[217411]: 2025-11-29 06:53:20.321594249 +0000 UTC m=+0.029242345 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.418 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.747 186962 DEBUG nova.compute.manager [req-d3d85f0d-8234-4543-82a1-00c14a08b0de req-cb7dbc30-561a-441d-9b5a-749ca5c59719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.747 186962 DEBUG oslo_concurrency.lockutils [req-d3d85f0d-8234-4543-82a1-00c14a08b0de req-cb7dbc30-561a-441d-9b5a-749ca5c59719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.747 186962 DEBUG oslo_concurrency.lockutils [req-d3d85f0d-8234-4543-82a1-00c14a08b0de req-cb7dbc30-561a-441d-9b5a-749ca5c59719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.747 186962 DEBUG oslo_concurrency.lockutils [req-d3d85f0d-8234-4543-82a1-00c14a08b0de req-cb7dbc30-561a-441d-9b5a-749ca5c59719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.747 186962 DEBUG nova.compute.manager [req-d3d85f0d-8234-4543-82a1-00c14a08b0de req-cb7dbc30-561a-441d-9b5a-749ca5c59719 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Processing event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.748 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.752 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399200.75249, bf8f759b-677f-4b17-8d4d-2eee6b28a740 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.752 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.754 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.757 186962 INFO nova.virt.libvirt.driver [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance spawned successfully.#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.757 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.779 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.784 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.785 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.786 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.787 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.787 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.788 186962 DEBUG nova.virt.libvirt.driver [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.795 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.824 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.873 186962 INFO nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Took 11.50 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.873 186962 DEBUG nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:20 np0005539505 nova_compute[186958]: 2025-11-29 06:53:20.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:20.933 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:21 np0005539505 nova_compute[186958]: 2025-11-29 06:53:21.003 186962 INFO nova.compute.manager [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Took 12.10 seconds to build instance.#033[00m
Nov 29 01:53:21 np0005539505 nova_compute[186958]: 2025-11-29 06:53:21.029 186962 DEBUG oslo_concurrency.lockutils [None req-2d9e95a6-c723-46a7-8d10-43d741514be2 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:21 np0005539505 podman[217411]: 2025-11-29 06:53:21.18928833 +0000 UTC m=+0.896936376 container create 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 01:53:21 np0005539505 nova_compute[186958]: 2025-11-29 06:53:21.202 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399186.2008722, a6a138ce-4707-4db6-892e-31809c4b4e03 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:21 np0005539505 nova_compute[186958]: 2025-11-29 06:53:21.203 186962 INFO nova.compute.manager [-] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:53:21 np0005539505 nova_compute[186958]: 2025-11-29 06:53:21.232 186962 DEBUG nova.compute.manager [None req-845c1d15-c677-45ba-a2b6-11a2920536c2 - - - - - -] [instance: a6a138ce-4707-4db6-892e-31809c4b4e03] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:21 np0005539505 systemd[1]: Started libpod-conmon-76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6.scope.
Nov 29 01:53:21 np0005539505 podman[217425]: 2025-11-29 06:53:21.346924452 +0000 UTC m=+0.102759066 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:53:21 np0005539505 podman[217424]: 2025-11-29 06:53:21.358059846 +0000 UTC m=+0.119938961 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git)
Nov 29 01:53:21 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:53:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efdba227cb76f149c9c5553cd4efa55ac4222b7300183c3c1c2c30670659c9a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:53:21 np0005539505 podman[217411]: 2025-11-29 06:53:21.537874813 +0000 UTC m=+1.245522939 container init 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:21 np0005539505 podman[217411]: 2025-11-29 06:53:21.543161852 +0000 UTC m=+1.250809898 container start 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:53:21 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [NOTICE]   (217474) : New worker (217476) forked
Nov 29 01:53:21 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [NOTICE]   (217474) : Loading success.
Nov 29 01:53:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:21.627 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:53:22 np0005539505 nova_compute[186958]: 2025-11-29 06:53:22.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.011 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.073 186962 DEBUG nova.compute.manager [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.074 186962 DEBUG oslo_concurrency.lockutils [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.074 186962 DEBUG oslo_concurrency.lockutils [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.074 186962 DEBUG oslo_concurrency.lockutils [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.074 186962 DEBUG nova.compute.manager [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] No waiting events found dispatching network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.074 186962 WARNING nova.compute.manager [req-6d940411-c153-4e51-8fff-ca33712717ec req-2f54c8fe-5357-4823-adc2-b81fc7cfa904 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received unexpected event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.795 186962 DEBUG oslo_concurrency.lockutils [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.796 186962 DEBUG oslo_concurrency.lockutils [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.796 186962 DEBUG nova.compute.manager [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.801 186962 DEBUG nova.compute.manager [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.802 186962 DEBUG nova.objects.instance [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'flavor' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.825 186962 DEBUG nova.objects.instance [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'info_cache' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:23 np0005539505 nova_compute[186958]: 2025-11-29 06:53:23.867 186962 DEBUG nova.virt.libvirt.driver [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:53:24 np0005539505 nova_compute[186958]: 2025-11-29 06:53:24.292 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Check if temp file /var/lib/nova/instances/tmpy70gw60a exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 29 01:53:24 np0005539505 nova_compute[186958]: 2025-11-29 06:53:24.293 186962 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 29 01:53:24 np0005539505 podman[217485]: 2025-11-29 06:53:24.752992991 +0000 UTC m=+0.085564832 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:53:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:25.629 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:25 np0005539505 nova_compute[186958]: 2025-11-29 06:53:25.650 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:25 np0005539505 nova_compute[186958]: 2025-11-29 06:53:25.741 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:25 np0005539505 nova_compute[186958]: 2025-11-29 06:53:25.742 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:25 np0005539505 nova_compute[186958]: 2025-11-29 06:53:25.808 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:26.929 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:26.929 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:26.930 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:27 np0005539505 nova_compute[186958]: 2025-11-29 06:53:27.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:28 np0005539505 nova_compute[186958]: 2025-11-29 06:53:28.013 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:28 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:53:28 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:53:28 np0005539505 systemd-logind[794]: New session 29 of user nova.
Nov 29 01:53:28 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:53:28 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:53:28 np0005539505 systemd[217514]: Queued start job for default target Main User Target.
Nov 29 01:53:28 np0005539505 systemd[217514]: Created slice User Application Slice.
Nov 29 01:53:28 np0005539505 systemd[217514]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:53:28 np0005539505 systemd[217514]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:53:28 np0005539505 systemd[217514]: Reached target Paths.
Nov 29 01:53:28 np0005539505 systemd[217514]: Reached target Timers.
Nov 29 01:53:28 np0005539505 systemd[217514]: Starting D-Bus User Message Bus Socket...
Nov 29 01:53:28 np0005539505 systemd[217514]: Starting Create User's Volatile Files and Directories...
Nov 29 01:53:28 np0005539505 systemd[217514]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:53:28 np0005539505 systemd[217514]: Reached target Sockets.
Nov 29 01:53:28 np0005539505 systemd[217514]: Finished Create User's Volatile Files and Directories.
Nov 29 01:53:28 np0005539505 systemd[217514]: Reached target Basic System.
Nov 29 01:53:28 np0005539505 systemd[217514]: Reached target Main User Target.
Nov 29 01:53:28 np0005539505 systemd[217514]: Startup finished in 137ms.
Nov 29 01:53:28 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:53:28 np0005539505 systemd[1]: Started Session 29 of User nova.
Nov 29 01:53:28 np0005539505 systemd[1]: session-29.scope: Deactivated successfully.
Nov 29 01:53:28 np0005539505 systemd-logind[794]: Session 29 logged out. Waiting for processes to exit.
Nov 29 01:53:28 np0005539505 systemd-logind[794]: Removed session 29.
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.651 186962 DEBUG nova.compute.manager [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.653 186962 DEBUG oslo_concurrency.lockutils [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.654 186962 DEBUG oslo_concurrency.lockutils [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.655 186962 DEBUG oslo_concurrency.lockutils [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.655 186962 DEBUG nova.compute.manager [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:29 np0005539505 nova_compute[186958]: 2025-11-29 06:53:29.656 186962 DEBUG nova.compute.manager [req-f38b057e-0f6f-4a8f-bd13-4d666c7b7cc0 req-b9ef10ab-97f4-4b74-8fab-e48aa8ced7ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.354 186962 INFO nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Took 4.54 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.354 186962 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.373 186962 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpy70gw60a',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2e380200-8276-4470-965f-31baa0bfd760',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(f9e8d008-f146-49a2-a2c0-835a0311a251),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.393 186962 DEBUG nova.objects.instance [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 2e380200-8276-4470-965f-31baa0bfd760 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.394 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.396 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.396 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.409 186962 DEBUG nova.virt.libvirt.vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:20Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.409 186962 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.410 186962 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.410 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 01:53:30 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:56:a7:a1"/>
Nov 29 01:53:30 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 01:53:30 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:53:30 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 01:53:30 np0005539505 nova_compute[186958]:  <target dev="tap1ff22547-58"/>
Nov 29 01:53:30 np0005539505 nova_compute[186958]: </interface>
Nov 29 01:53:30 np0005539505 nova_compute[186958]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.411 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 29 01:53:30 np0005539505 podman[217531]: 2025-11-29 06:53:30.718129264 +0000 UTC m=+0.049491456 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:53:30 np0005539505 podman[217532]: 2025-11-29 06:53:30.775263224 +0000 UTC m=+0.105422282 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.898 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:30 np0005539505 nova_compute[186958]: 2025-11-29 06:53:30.899 186962 INFO nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.011 186962 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.514 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.515 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.785 186962 DEBUG nova.compute.manager [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.785 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.785 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.786 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.786 186962 DEBUG nova.compute.manager [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.786 186962 WARNING nova.compute.manager [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.786 186962 DEBUG nova.compute.manager [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.786 186962 DEBUG nova.compute.manager [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing instance network info cache due to event network-changed-1ff22547-5892-4360-8abe-429ea2f212ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.787 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.787 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:31 np0005539505 nova_compute[186958]: 2025-11-29 06:53:31.787 186962 DEBUG nova.network.neutron [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Refreshing network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.018 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.018 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.521 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.523 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.783 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.886 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399212.8861995, 2e380200-8276-4470-965f-31baa0bfd760 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.886 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.919 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.932 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:53:32 np0005539505 nova_compute[186958]: 2025-11-29 06:53:32.957 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.025 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.026 186962 DEBUG nova.virt.libvirt.migration [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 kernel: tap1ff22547-58 (unregistering): left promiscuous mode
Nov 29 01:53:33 np0005539505 NetworkManager[55134]: <info>  [1764399213.2071] device (tap1ff22547-58): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:53:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:33Z|00109|binding|INFO|Releasing lport 1ff22547-5892-4360-8abe-429ea2f212ee from this chassis (sb_readonly=0)
Nov 29 01:53:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:33Z|00110|binding|INFO|Setting lport 1ff22547-5892-4360-8abe-429ea2f212ee down in Southbound
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.214 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:33Z|00111|binding|INFO|Removing iface tap1ff22547-58 ovn-installed in OVS
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.216 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.223 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:a7:a1 10.100.0.5'], port_security=['fa:16:3e:56:a7:a1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'a43628b3-9efd-4940-9509-686038e16aeb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2e380200-8276-4470-965f-31baa0bfd760', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '17', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1ff22547-5892-4360-8abe-429ea2f212ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.224 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1ff22547-5892-4360-8abe-429ea2f212ee in datapath c691e2c0-bf24-480c-9af6-236639f0492c unbound from our chassis#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.225 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c691e2c0-bf24-480c-9af6-236639f0492c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.227 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8979421-d089-4cc1-a6b5-5cda6e7d27bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.227 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace which is not needed anymore#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.229 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 29 01:53:33 np0005539505 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Consumed 3.943s CPU time.
Nov 29 01:53:33 np0005539505 systemd-machined[153285]: Machine qemu-12-instance-00000017 terminated.
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.352 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.352 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.352 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [NOTICE]   (217308) : haproxy version is 2.8.14-c23fe91
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [NOTICE]   (217308) : path to executable is /usr/sbin/haproxy
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [WARNING]  (217308) : Exiting Master process...
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [WARNING]  (217308) : Exiting Master process...
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [ALERT]    (217308) : Current worker (217310) exited with code 143 (Terminated)
Nov 29 01:53:33 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[217304]: [WARNING]  (217308) : All workers exited. Exiting... (0)
Nov 29 01:53:33 np0005539505 systemd[1]: libpod-15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245.scope: Deactivated successfully.
Nov 29 01:53:33 np0005539505 podman[217636]: 2025-11-29 06:53:33.439912949 +0000 UTC m=+0.106463430 container died 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245-userdata-shm.mount: Deactivated successfully.
Nov 29 01:53:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-9e59acc661d7e3f818d37eca485e1bb3a4c6f9f88eafd952fb94d5efd1733e56-merged.mount: Deactivated successfully.
Nov 29 01:53:33 np0005539505 podman[217636]: 2025-11-29 06:53:33.502301318 +0000 UTC m=+0.168851779 container cleanup 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 01:53:33 np0005539505 systemd[1]: libpod-conmon-15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245.scope: Deactivated successfully.
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.528 186962 DEBUG nova.virt.libvirt.guest [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '2e380200-8276-4470-965f-31baa0bfd760' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.529 186962 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migration operation has completed#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.529 186962 INFO nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] _post_live_migration() is started..#033[00m
Nov 29 01:53:33 np0005539505 podman[217673]: 2025-11-29 06:53:33.563943755 +0000 UTC m=+0.041524822 container remove 15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.568 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e460e7f-565a-4a04-ad76-23bd80c9d687]: (4, ('Sat Nov 29 06:53:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245)\n15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245\nSat Nov 29 06:53:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245)\n15f61c7f66548118ecb801e4b7ecf8d920b2fb2473f1720ba93eb63de7a0b245\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.570 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2114d335-9c76-4d57-8166-d6e3824ae378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.571 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.574 186962 DEBUG nova.network.neutron [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updated VIF entry in instance network info cache for port 1ff22547-5892-4360-8abe-429ea2f212ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.575 186962 DEBUG nova.network.neutron [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Updating instance_info_cache with network_info: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:33 np0005539505 kernel: tapc691e2c0-b0: left promiscuous mode
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.577 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.591 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.594 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[039a499c-d0ba-4175-b95a-7a99eaec8f7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.609 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5fafcc50-7175-4967-8f20-4661d39116a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.610 186962 DEBUG oslo_concurrency.lockutils [req-c74e9e90-1c0f-462b-82f8-ed0dd9e8e779 req-e148da3b-fa68-4455-8640-b79bb4f3f802 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2e380200-8276-4470-965f-31baa0bfd760" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.611 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[79b8f2bd-492d-4167-bfaf-159e1d97fbc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.625 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe1952f-df60-4662-9e15-771a175f2147]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464132, 'reachable_time': 24069, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217690, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 systemd[1]: run-netns-ovnmeta\x2dc691e2c0\x2dbf24\x2d480c\x2d9af6\x2d236639f0492c.mount: Deactivated successfully.
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.630 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:53:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:33.631 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[f5dfb056-5b4d-44eb-8408-fa6e69bbab26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.754 186962 DEBUG nova.compute.manager [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.755 186962 DEBUG oslo_concurrency.lockutils [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.755 186962 DEBUG oslo_concurrency.lockutils [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.755 186962 DEBUG oslo_concurrency.lockutils [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.755 186962 DEBUG nova.compute.manager [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.756 186962 DEBUG nova.compute.manager [req-e1a0113d-95ad-49a1-bfb4-ba46a69794fd req-9725a392-8a23-4419-9850-434425910d49 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:33 np0005539505 nova_compute[186958]: 2025-11-29 06:53:33.913 186962 DEBUG nova.virt.libvirt.driver [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.593 186962 DEBUG nova.network.neutron [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Activated binding for port 1ff22547-5892-4360-8abe-429ea2f212ee and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.594 186962 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.595 186962 DEBUG nova.virt.libvirt.vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:52:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1351543550',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1351543550',id=23,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:52:57Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-68mzhrqj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio
',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:23Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=2e380200-8276-4470-965f-31baa0bfd760,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.596 186962 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "1ff22547-5892-4360-8abe-429ea2f212ee", "address": "fa:16:3e:56:a7:a1", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ff22547-58", "ovs_interfaceid": "1ff22547-5892-4360-8abe-429ea2f212ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.597 186962 DEBUG nova.network.os_vif_util [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.597 186962 DEBUG os_vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.599 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.600 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ff22547-58, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.602 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.605 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.608 186962 INFO os_vif [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:a7:a1,bridge_name='br-int',has_traffic_filtering=True,id=1ff22547-5892-4360-8abe-429ea2f212ee,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ff22547-58')#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.608 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.609 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.609 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.609 186962 DEBUG nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.609 186962 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deleting instance files /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del#033[00m
Nov 29 01:53:34 np0005539505 nova_compute[186958]: 2025-11-29 06:53:34.610 186962 INFO nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Deletion of /var/lib/nova/instances/2e380200-8276-4470-965f-31baa0bfd760_del complete#033[00m
Nov 29 01:53:35 np0005539505 podman[217691]: 2025-11-29 06:53:35.738444401 +0000 UTC m=+0.071029673 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.844 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.845 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.845 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.846 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.846 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.846 186962 WARNING nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.846 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.846 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 WARNING nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.847 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.848 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.848 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.848 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.848 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-unplugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.848 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.849 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.849 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.849 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.849 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.850 186962 WARNING nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.850 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.850 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.850 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.850 186962 DEBUG oslo_concurrency.lockutils [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.851 186962 DEBUG nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] No waiting events found dispatching network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:35 np0005539505 nova_compute[186958]: 2025-11-29 06:53:35.851 186962 WARNING nova.compute.manager [req-59dc6e7b-05f6-4e85-9f88-a994d0bb2461 req-f784fa4e-76e3-4c68-96da-4b6c4403cf51 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Received unexpected event network-vif-plugged-1ff22547-5892-4360-8abe-429ea2f212ee for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:53:36 np0005539505 kernel: tapfd28d90e-6a (unregistering): left promiscuous mode
Nov 29 01:53:36 np0005539505 NetworkManager[55134]: <info>  [1764399216.1668] device (tapfd28d90e-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:53:36 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:36Z|00112|binding|INFO|Releasing lport fd28d90e-6ad0-431b-9e31-3be4166a5614 from this chassis (sb_readonly=0)
Nov 29 01:53:36 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:36Z|00113|binding|INFO|Setting lport fd28d90e-6ad0-431b-9e31-3be4166a5614 down in Southbound
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:36 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:36Z|00114|binding|INFO|Removing iface tapfd28d90e-6a ovn-installed in OVS
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.180 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.186 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:bb:f3 10.100.0.12'], port_security=['fa:16:3e:1e:bb:f3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bf8f759b-677f-4b17-8d4d-2eee6b28a740', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=fd28d90e-6ad0-431b-9e31-3be4166a5614) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.189 104094 INFO neutron.agent.ovn.metadata.agent [-] Port fd28d90e-6ad0-431b-9e31-3be4166a5614 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.192 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.194 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a61a48d9-fdaf-486e-9e43-18dbce6501b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.194 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.195 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore#033[00m
Nov 29 01:53:36 np0005539505 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Deactivated successfully.
Nov 29 01:53:36 np0005539505 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Consumed 12.738s CPU time.
Nov 29 01:53:36 np0005539505 systemd-machined[153285]: Machine qemu-13-instance-00000019 terminated.
Nov 29 01:53:36 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [NOTICE]   (217474) : haproxy version is 2.8.14-c23fe91
Nov 29 01:53:36 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [NOTICE]   (217474) : path to executable is /usr/sbin/haproxy
Nov 29 01:53:36 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [WARNING]  (217474) : Exiting Master process...
Nov 29 01:53:36 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [ALERT]    (217474) : Current worker (217476) exited with code 143 (Terminated)
Nov 29 01:53:36 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[217462]: [WARNING]  (217474) : All workers exited. Exiting... (0)
Nov 29 01:53:36 np0005539505 systemd[1]: libpod-76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6.scope: Deactivated successfully.
Nov 29 01:53:36 np0005539505 podman[217733]: 2025-11-29 06:53:36.319288487 +0000 UTC m=+0.044966438 container died 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:36 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6-userdata-shm.mount: Deactivated successfully.
Nov 29 01:53:36 np0005539505 systemd[1]: var-lib-containers-storage-overlay-efdba227cb76f149c9c5553cd4efa55ac4222b7300183c3c1c2c30670659c9a8-merged.mount: Deactivated successfully.
Nov 29 01:53:36 np0005539505 podman[217733]: 2025-11-29 06:53:36.353847241 +0000 UTC m=+0.079525192 container cleanup 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 01:53:36 np0005539505 systemd[1]: libpod-conmon-76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6.scope: Deactivated successfully.
Nov 29 01:53:36 np0005539505 NetworkManager[55134]: <info>  [1764399216.3930] manager: (tapfd28d90e-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Nov 29 01:53:36 np0005539505 podman[217761]: 2025-11-29 06:53:36.442730796 +0000 UTC m=+0.068747708 container remove 76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.449 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d14298-9ef8-42d3-a132-160ea2f89e17]: (4, ('Sat Nov 29 06:53:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6)\n76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6\nSat Nov 29 06:53:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6)\n76ea28a4ff45a4b1dc8894d18c047af743e3ab641e45d956ead84cae5a185ef6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.451 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b169f6-3bbb-4cbf-aec7-5a91e7c1e8b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.452 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.454 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:36 np0005539505 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.471 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.474 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0eeeaaee-2e15-4af5-9a33-c7cd558d01e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.493 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c98824-8c01-476f-a3a6-4e426c3d8eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.495 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ff55df-91f2-41f1-8315-bf6f69eb5ac0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.511 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c84c7203-1e4c-4741-ab58-00baa81e3a96]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464725, 'reachable_time': 24191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217794, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.515 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:53:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:36.515 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a24284-24c4-4eae-b9aa-6b01d176b4ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.553 186962 DEBUG nova.compute.manager [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-vif-unplugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.554 186962 DEBUG oslo_concurrency.lockutils [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.554 186962 DEBUG oslo_concurrency.lockutils [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.554 186962 DEBUG oslo_concurrency.lockutils [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.554 186962 DEBUG nova.compute.manager [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] No waiting events found dispatching network-vif-unplugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.554 186962 WARNING nova.compute.manager [req-235be538-2fba-42f6-8c37-381c1e816445 req-8489155b-2ea9-4233-8567-1747da7e963b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received unexpected event network-vif-unplugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 for instance with vm_state active and task_state powering-off.#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.926 186962 INFO nova.virt.libvirt.driver [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.931 186962 INFO nova.virt.libvirt.driver [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance destroyed successfully.#033[00m
Nov 29 01:53:36 np0005539505 nova_compute[186958]: 2025-11-29 06:53:36.932 186962 DEBUG nova.objects.instance [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'numa_topology' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:37 np0005539505 nova_compute[186958]: 2025-11-29 06:53:37.081 186962 DEBUG nova.compute.manager [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:37 np0005539505 nova_compute[186958]: 2025-11-29 06:53:37.155 186962 DEBUG oslo_concurrency.lockutils [None req-167a6239-79e3-4067-86b2-5171fd057799 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.359s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:37 np0005539505 nova_compute[186958]: 2025-11-29 06:53:37.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.676 186962 DEBUG nova.compute.manager [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.676 186962 DEBUG oslo_concurrency.lockutils [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.676 186962 DEBUG oslo_concurrency.lockutils [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.677 186962 DEBUG oslo_concurrency.lockutils [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.677 186962 DEBUG nova.compute.manager [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] No waiting events found dispatching network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:53:38 np0005539505 nova_compute[186958]: 2025-11-29 06:53:38.677 186962 WARNING nova.compute.manager [req-c177bd08-54b6-4e74-b8de-5561c0fb0d51 req-cf034d3c-9a03-4cf3-a3a9-2a0904d3ecdc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received unexpected event network-vif-plugged-fd28d90e-6ad0-431b-9e31-3be4166a5614 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 01:53:38 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:53:38 np0005539505 systemd[217514]: Activating special unit Exit the Session...
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped target Main User Target.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped target Basic System.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped target Paths.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped target Sockets.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped target Timers.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:53:38 np0005539505 systemd[217514]: Closed D-Bus User Message Bus Socket.
Nov 29 01:53:38 np0005539505 systemd[217514]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:53:38 np0005539505 systemd[217514]: Removed slice User Application Slice.
Nov 29 01:53:38 np0005539505 systemd[217514]: Reached target Shutdown.
Nov 29 01:53:38 np0005539505 systemd[217514]: Finished Exit the Session.
Nov 29 01:53:38 np0005539505 systemd[217514]: Reached target Exit the Session.
Nov 29 01:53:38 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:53:38 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:53:38 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:53:38 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:53:38 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:53:38 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:53:38 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:53:39 np0005539505 nova_compute[186958]: 2025-11-29 06:53:39.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:39 np0005539505 podman[217797]: 2025-11-29 06:53:39.74995382 +0000 UTC m=+0.073520333 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.030 186962 DEBUG nova.compute.manager [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.087 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "2e380200-8276-4470-965f-31baa0bfd760-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.088 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.089 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "2e380200-8276-4470-965f-31baa0bfd760-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.116 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.117 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.117 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.117 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.328 186962 INFO nova.compute.manager [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] instance snapshotting#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.329 186962 WARNING nova.compute.manager [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] trying to snapshot a non-running instance: (state: 4 expected: 1)#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.342 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.401 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.402 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.481 186962 DEBUG oslo_concurrency.processutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.662 186962 INFO nova.virt.libvirt.driver [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Beginning cold snapshot process#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.682 186962 WARNING nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.684 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5735MB free_disk=73.23998641967773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.684 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.685 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.762 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration for instance 2e380200-8276-4470-965f-31baa0bfd760 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.793 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.829 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Instance bf8f759b-677f-4b17-8d4d-2eee6b28a740 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.829 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration f9e8d008-f146-49a2-a2c0-835a0311a251 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.830 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.830 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.893 186962 DEBUG nova.compute.provider_tree [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.901 186962 DEBUG nova.privsep.utils [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.901 186962 DEBUG oslo_concurrency.processutils [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk /var/lib/nova/instances/snapshots/tmpbhmlauo6/3c0cd61843204ce188ea1bad6f77c446 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.927 186962 DEBUG nova.scheduler.client.report [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.955 186962 DEBUG nova.compute.resource_tracker [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.955 186962 DEBUG oslo_concurrency.lockutils [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:40 np0005539505 nova_compute[186958]: 2025-11-29 06:53:40.975 186962 INFO nova.compute.manager [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Nov 29 01:53:41 np0005539505 nova_compute[186958]: 2025-11-29 06:53:41.060 186962 INFO nova.scheduler.client.report [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Deleted allocation for migration f9e8d008-f146-49a2-a2c0-835a0311a251
Nov 29 01:53:41 np0005539505 nova_compute[186958]: 2025-11-29 06:53:41.061 186962 DEBUG nova.virt.libvirt.driver [None req-7febfa0d-d5f7-419e-9615-fc0d7a379468 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Nov 29 01:53:41 np0005539505 nova_compute[186958]: 2025-11-29 06:53:41.404 186962 DEBUG oslo_concurrency.processutils [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk /var/lib/nova/instances/snapshots/tmpbhmlauo6/3c0cd61843204ce188ea1bad6f77c446" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:41 np0005539505 nova_compute[186958]: 2025-11-29 06:53:41.406 186962 INFO nova.virt.libvirt.driver [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Snapshot extracted, beginning image upload
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.394 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.419 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.420 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.420 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.421 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.509 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.597 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.598 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.680 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.811 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.812 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5755MB free_disk=73.1911506652832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.813 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.813 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.893 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance bf8f759b-677f-4b17-8d4d-2eee6b28a740 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.894 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.894 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.913 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.931 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.931 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 29 01:53:42 np0005539505 nova_compute[186958]: 2025-11-29 06:53:42.974 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 29 01:53:43 np0005539505 nova_compute[186958]: 2025-11-29 06:53:43.011 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 29 01:53:43 np0005539505 nova_compute[186958]: 2025-11-29 06:53:43.072 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 01:53:43 np0005539505 nova_compute[186958]: 2025-11-29 06:53:43.095 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 01:53:43 np0005539505 nova_compute[186958]: 2025-11-29 06:53:43.128 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 01:53:43 np0005539505 nova_compute[186958]: 2025-11-29 06:53:43.129 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:44 np0005539505 nova_compute[186958]: 2025-11-29 06:53:44.568 186962 INFO nova.virt.libvirt.driver [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Snapshot image upload complete
Nov 29 01:53:44 np0005539505 nova_compute[186958]: 2025-11-29 06:53:44.569 186962 INFO nova.compute.manager [None req-341b99e6-d072-486d-8b63-57620651ba28 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Took 4.21 seconds to snapshot the instance on the hypervisor.
Nov 29 01:53:44 np0005539505 nova_compute[186958]: 2025-11-29 06:53:44.605 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:53:45 np0005539505 nova_compute[186958]: 2025-11-29 06:53:45.129 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:45 np0005539505 nova_compute[186958]: 2025-11-29 06:53:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:45 np0005539505 nova_compute[186958]: 2025-11-29 06:53:45.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.539 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.540 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.541 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.541 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.542 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.558 186962 INFO nova.compute.manager [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Terminating instance
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.573 186962 DEBUG nova.compute.manager [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.583 186962 INFO nova.virt.libvirt.driver [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Instance destroyed successfully.
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.584 186962 DEBUG nova.objects.instance [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.607 186962 DEBUG nova.virt.libvirt.vif [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1708997664',display_name='tempest-ImagesTestJSON-server-1708997664',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1708997664',id=25,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:53:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-g5l6lwqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:53:44Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=bf8f759b-677f-4b17-8d4d-2eee6b28a740,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.608 186962 DEBUG nova.network.os_vif_util [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.610 186962 DEBUG nova.network.os_vif_util [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.610 186962 DEBUG os_vif [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.613 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd28d90e-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.620 186962 INFO os_vif [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:bb:f3,bridge_name='br-int',has_traffic_filtering=True,id=fd28d90e-6ad0-431b-9e31-3be4166a5614,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd28d90e-6a')#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.621 186962 INFO nova.virt.libvirt.driver [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Deleting instance files /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740_del#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.627 186962 INFO nova.virt.libvirt.driver [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Deletion of /var/lib/nova/instances/bf8f759b-677f-4b17-8d4d-2eee6b28a740_del complete#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.636 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.637 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.637 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.637 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bf8f759b-677f-4b17-8d4d-2eee6b28a740 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.950 186962 INFO nova.compute.manager [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.951 186962 DEBUG oslo.service.loopingcall [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.951 186962 DEBUG nova.compute.manager [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:53:46 np0005539505 nova_compute[186958]: 2025-11-29 06:53:46.952 186962 DEBUG nova.network.neutron [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:53:47 np0005539505 nova_compute[186958]: 2025-11-29 06:53:47.821 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.231 186962 DEBUG nova.network.neutron [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.252 186962 INFO nova.compute.manager [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Took 1.30 seconds to deallocate network for instance.#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.332 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.332 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.349 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399213.3485153, 2e380200-8276-4470-965f-31baa0bfd760 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.349 186962 INFO nova.compute.manager [-] [instance: 2e380200-8276-4470-965f-31baa0bfd760] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.360 186962 DEBUG nova.compute.manager [req-8f9ee6cd-c68e-4869-8580-7f3c2e1ee1c0 req-a76e2bcb-636d-42d9-9153-239c29a87a9a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Received event network-vif-deleted-fd28d90e-6ad0-431b-9e31-3be4166a5614 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.372 186962 DEBUG nova.compute.manager [None req-63fadbf5-a693-44e0-bf27-ffae79905a6f - - - - - -] [instance: 2e380200-8276-4470-965f-31baa0bfd760] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.401 186962 DEBUG nova.compute.provider_tree [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.426 186962 DEBUG nova.scheduler.client.report [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.501 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.537 186962 INFO nova.scheduler.client.report [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance bf8f759b-677f-4b17-8d4d-2eee6b28a740#033[00m
Nov 29 01:53:48 np0005539505 nova_compute[186958]: 2025-11-29 06:53:48.627 186962 DEBUG oslo_concurrency.lockutils [None req-a1a8f111-a04c-41e8-bc51-f19989772a19 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "bf8f759b-677f-4b17-8d4d-2eee6b28a740" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:49 np0005539505 nova_compute[186958]: 2025-11-29 06:53:49.276 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updating instance_info_cache with network_info: [{"id": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "address": "fa:16:3e:1e:bb:f3", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd28d90e-6a", "ovs_interfaceid": "fd28d90e-6ad0-431b-9e31-3be4166a5614", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:49 np0005539505 nova_compute[186958]: 2025-11-29 06:53:49.300 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-bf8f759b-677f-4b17-8d4d-2eee6b28a740" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:49 np0005539505 nova_compute[186958]: 2025-11-29 06:53:49.300 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:53:49 np0005539505 nova_compute[186958]: 2025-11-29 06:53:49.301 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:49 np0005539505 nova_compute[186958]: 2025-11-29 06:53:49.301 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:51 np0005539505 nova_compute[186958]: 2025-11-29 06:53:51.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:53:51 np0005539505 nova_compute[186958]: 2025-11-29 06:53:51.428 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399216.4267015, bf8f759b-677f-4b17-8d4d-2eee6b28a740 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:51 np0005539505 nova_compute[186958]: 2025-11-29 06:53:51.428 186962 INFO nova.compute.manager [-] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:53:51 np0005539505 nova_compute[186958]: 2025-11-29 06:53:51.448 186962 DEBUG nova.compute.manager [None req-5ff6e6eb-6ec0-403c-bd13-b80b3a8ea7f1 - - - - - -] [instance: bf8f759b-677f-4b17-8d4d-2eee6b28a740] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:53:51 np0005539505 nova_compute[186958]: 2025-11-29 06:53:51.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:51 np0005539505 podman[217842]: 2025-11-29 06:53:51.727167673 +0000 UTC m=+0.053799387 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:53:51 np0005539505 podman[217841]: 2025-11-29 06:53:51.740377425 +0000 UTC m=+0.071387333 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 01:53:52 np0005539505 nova_compute[186958]: 2025-11-29 06:53:52.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.343 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.344 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.377 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.641 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.642 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.648 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.649 186962 INFO nova.compute.claims [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.762 186962 DEBUG nova.compute.provider_tree [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.778 186962 DEBUG nova.scheduler.client.report [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.803 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.804 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.856 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.857 186962 DEBUG nova.network.neutron [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.882 186962 INFO nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:53:54 np0005539505 nova_compute[186958]: 2025-11-29 06:53:54.910 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.047 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.050 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.050 186962 INFO nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating image(s)#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.051 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.052 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.054 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.082 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.160 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.162 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.162 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.174 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.247 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.248 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.292 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
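The `CMD` entries above show how oslo.concurrency runs external tools: every `qemu-img` call is prefixed with a `prlimit` helper that caps the child's address space (`--as=1073741824`, 1 GiB) and CPU time (`--cpu=30`) before exec'ing `env LC_ALL=C LANG=C qemu-img …`. A minimal sketch of that argv construction, with the limits and paths taken from the log (the helper function name is illustrative, not Nova's):

```python
# Sketch of the prlimit-wrapped command line seen in the log above.
# build_prlimit_cmd is a hypothetical name; the wrapper module and flags
# (oslo_concurrency.prlimit, --as, --cpu) are the ones visible in the log.

def build_prlimit_cmd(cmd, address_space=1073741824, cpu_seconds=30):
    """Prefix *cmd* with the oslo prlimit wrapper, mirroring the logged CMD."""
    wrapper = [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={address_space}", f"--cpu={cpu_seconds}", "--",
        "env", "LC_ALL=C", "LANG=C",
    ]
    return wrapper + list(cmd)

base = "/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28"
cmd = build_prlimit_cmd(
    ["qemu-img", "info", base, "--force-share", "--output=json"])
print(" ".join(cmd))
```

The `--force-share` flag lets `qemu-img info` open an image that another process (e.g. a running guest) already holds open.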
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.294 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.295 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.325 186962 DEBUG nova.policy [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea965b54cc694db4abef98ad9973e9f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.363 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.364 186962 DEBUG nova.virt.disk.api [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Checking if we can resize image /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.365 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.430 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.432 186962 DEBUG nova.virt.disk.api [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Cannot resize image /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
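The "Cannot resize image … to a smaller size" message above is the outcome of a simple comparison between the requested size (1073741824 bytes) and the image's current virtual size as reported by `qemu-img info --output=json`. A minimal re-creation of that check — the function body and the sample JSON are illustrative, consistent with the log where a same-size resize was skipped, not Nova's actual code:

```python
import json

def can_resize_image(qemu_img_info_json, requested_size):
    """Allow a resize only if it would strictly grow the virtual size."""
    info = json.loads(qemu_img_info_json)
    return requested_size > info["virtual-size"]

# Sample qemu-img info output; virtual-size value is assumed for illustration.
sample = json.dumps({"virtual-size": 1073741824, "format": "qcow2"})
print(can_resize_image(sample, 2147483648))  # grow: allowed
print(can_resize_image(sample, 1073741824))  # same size: refused, as logged
```

Refusing shrink is deliberate: truncating a qcow2 overlay below its virtual size can discard guest data.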
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.433 186962 DEBUG nova.objects.instance [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'migration_context' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.469 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.470 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Ensure instance console log exists: /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.471 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.471 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:55 np0005539505 nova_compute[186958]: 2025-11-29 06:53:55.472 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:55 np0005539505 podman[217903]: 2025-11-29 06:53:55.750079454 +0000 UTC m=+0.060960237 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:53:56 np0005539505 nova_compute[186958]: 2025-11-29 06:53:56.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:56 np0005539505 nova_compute[186958]: 2025-11-29 06:53:56.772 186962 DEBUG nova.network.neutron [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Successfully updated port: 38a0a755-afa1-4c97-9582-82fb739c7e6c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:53:56 np0005539505 nova_compute[186958]: 2025-11-29 06:53:56.803 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:56 np0005539505 nova_compute[186958]: 2025-11-29 06:53:56.804 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquired lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:56 np0005539505 nova_compute[186958]: 2025-11-29 06:53:56.804 186962 DEBUG nova.network.neutron [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:53:57 np0005539505 nova_compute[186958]: 2025-11-29 06:53:57.003 186962 DEBUG nova.compute.manager [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-changed-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:57 np0005539505 nova_compute[186958]: 2025-11-29 06:53:57.003 186962 DEBUG nova.compute.manager [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Refreshing instance network info cache due to event network-changed-38a0a755-afa1-4c97-9582-82fb739c7e6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:53:57 np0005539505 nova_compute[186958]: 2025-11-29 06:53:57.005 186962 DEBUG oslo_concurrency.lockutils [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:53:57 np0005539505 nova_compute[186958]: 2025-11-29 06:53:57.033 186962 DEBUG nova.network.neutron [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:53:57 np0005539505 nova_compute[186958]: 2025-11-29 06:53:57.826 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.483 186962 DEBUG nova.network.neutron [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m

Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.504 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Releasing lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.505 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance network_info: |[{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
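The `network_info` blob logged above is a list of VIF dicts. Extracting the MAC and fixed IPs is plain dict traversal; the structure below is trimmed from the logged entry, keeping only the keys the sketch touches:

```python
# Trimmed copy of the network_info entry from the log above.
network_info = [{
    "id": "38a0a755-afa1-4c97-9582-82fb739c7e6c",
    "address": "fa:16:3e:dd:1a:af",
    "network": {
        "subnets": [{
            "cidr": "10.100.0.0/28",
            "ips": [{"address": "10.100.0.7", "type": "fixed"}],
        }],
    },
}]

def fixed_ips(vifs):
    """Yield (mac, ip) pairs for every fixed IP across all VIFs."""
    for vif in vifs:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip["type"] == "fixed":
                    yield vif["address"], ip["address"]

print(list(fixed_ips(network_info)))
```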
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.506 186962 DEBUG oslo_concurrency.lockutils [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.508 186962 DEBUG nova.network.neutron [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Refreshing network info cache for port 38a0a755-afa1-4c97-9582-82fb739c7e6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.514 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Start _get_guest_xml network_info=[{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.522 186962 WARNING nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.533 186962 DEBUG nova.virt.libvirt.host [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.535 186962 DEBUG nova.virt.libvirt.host [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.541 186962 DEBUG nova.virt.libvirt.host [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.542 186962 DEBUG nova.virt.libvirt.host [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.544 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.545 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.546 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.546 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.547 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.547 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.548 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.549 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.549 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.550 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.550 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.551 186962 DEBUG nova.virt.hardware [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.558 186962 DEBUG nova.virt.libvirt.vif [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:54Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.559 186962 DEBUG nova.network.os_vif_util [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.563 186962 DEBUG nova.network.os_vif_util [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.565 186962 DEBUG nova.objects.instance [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.584 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <uuid>7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6</uuid>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <name>instance-0000001b</name>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1120356955</nova:name>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:53:58</nova:creationTime>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:user uuid="ea965b54cc694db4abef98ad9973e9f2">tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member</nova:user>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:project uuid="93dcd8ffe78147b69c244e2e3bfc2121">tempest-LiveAutoBlockMigrationV225Test-1343206834</nova:project>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        <nova:port uuid="38a0a755-afa1-4c97-9582-82fb739c7e6c">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="serial">7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="uuid">7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:dd:1a:af"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <target dev="tap38a0a755-af"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/console.log" append="off"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:53:58 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:53:58 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:53:58 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:53:58 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.588 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Preparing to wait for external event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.589 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.589 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.589 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.590 186962 DEBUG nova.virt.libvirt.vif [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:53:54Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.590 186962 DEBUG nova.network.os_vif_util [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.591 186962 DEBUG nova.network.os_vif_util [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.591 186962 DEBUG os_vif [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.592 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.593 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.596 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.596 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38a0a755-af, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.596 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap38a0a755-af, col_values=(('external_ids', {'iface-id': '38a0a755-afa1-4c97-9582-82fb739c7e6c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:1a:af', 'vm-uuid': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.598 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539505 NetworkManager[55134]: <info>  [1764399238.5995] manager: (tap38a0a755-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.608 186962 INFO os_vif [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af')#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.676 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.677 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.677 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] No VIF found with MAC fa:16:3e:dd:1a:af, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:53:58 np0005539505 nova_compute[186958]: 2025-11-29 06:53:58.677 186962 INFO nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Using config drive#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.080 186962 INFO nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Creating config drive at /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.086 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkzewcq_p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.212 186962 DEBUG oslo_concurrency.processutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkzewcq_p" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:53:59 np0005539505 kernel: tap38a0a755-af: entered promiscuous mode
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.2801] manager: (tap38a0a755-af): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00115|binding|INFO|Claiming lport 38a0a755-afa1-4c97-9582-82fb739c7e6c for this chassis.
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00116|binding|INFO|38a0a755-afa1-4c97-9582-82fb739c7e6c: Claiming fa:16:3e:dd:1a:af 10.100.0.7
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00117|binding|INFO|Claiming lport c3ae15da-9b95-4494-b01e-202115261f9d for this chassis.
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00118|binding|INFO|c3ae15da-9b95-4494-b01e-202115261f9d: Claiming fa:16:3e:02:a9:0c 19.80.0.52
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.287 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.295 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:1a:af 10.100.0.7'], port_security=['fa:16:3e:dd:1a:af 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2018647846', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2018647846', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=38a0a755-afa1-4c97-9582-82fb739c7e6c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.299 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a9:0c 19.80.0.52'], port_security=['fa:16:3e:02:a9:0c 19.80.0.52'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['38a0a755-afa1-4c97-9582-82fb739c7e6c'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-70072018', 'neutron:cidrs': '19.80.0.52/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-70072018', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=69db7c70-3ac5-4b08-99b1-a77caa10cb9e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3ae15da-9b95-4494-b01e-202115261f9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.301 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 38a0a755-afa1-4c97-9582-82fb739c7e6c in datapath c691e2c0-bf24-480c-9af6-236639f0492c bound to our chassis#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.304 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c691e2c0-bf24-480c-9af6-236639f0492c#033[00m
Nov 29 01:53:59 np0005539505 systemd-udevd[217942]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.321 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd156813-bae1-4889-ab33-be2c0f90cb83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.322 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc691e2c0-b1 in ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.3260] device (tap38a0a755-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.325 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc691e2c0-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.325 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd4465c-2683-47dc-8560-c1f8394a3ce8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.3270] device (tap38a0a755-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.327 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[63cba57f-78c9-4425-9589-9d7e33d2b30b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 systemd-machined[153285]: New machine qemu-14-instance-0000001b.
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.348 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[9541c9dc-3d1c-46b8-a70e-395d7836a2d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539505 systemd[1]: Started Virtual Machine qemu-14-instance-0000001b.
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00119|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c ovn-installed in OVS
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00120|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c up in Southbound
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00121|binding|INFO|Setting lport c3ae15da-9b95-4494-b01e-202115261f9d up in Southbound
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.382 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22979db6-1292-4422-b3cf-42781ed77124]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.425 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d2de14c1-6bb2-4df3-99da-b90c5be4e5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.433 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22ac9cdf-dcf7-4a90-941b-78fc25d7177d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.4341] manager: (tapc691e2c0-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.476 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3a3e47-0e3c-48e4-a9ee-7153f7489a3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.480 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f49295fb-d39c-4f34-8b88-a4879f11e793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.5114] device (tapc691e2c0-b0): carrier: link connected
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.517 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f73abc98-a1b1-4684-9e49-a9ad87b9fef8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.533 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[15f786c0-43f7-46ec-9883-6ea119621911]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468709, 'reachable_time': 41903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217975, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.551 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cda4d8d5-450b-437d-a823-5531be447beb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:3d81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468709, 'tstamp': 468709}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217976, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.572 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8145a482-0788-48f3-aba6-aa501c56ce7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc691e2c0-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:3d:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468709, 'reachable_time': 41903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217977, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.611 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5c894f-37f0-4adc-8793-29a4d06b01ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.680 186962 DEBUG nova.compute.manager [req-56b4b0c9-a696-4fee-bd81-81eef3000f7e req-13eb5e35-e999-4921-baf5-1d94c8b87af7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.680 186962 DEBUG oslo_concurrency.lockutils [req-56b4b0c9-a696-4fee-bd81-81eef3000f7e req-13eb5e35-e999-4921-baf5-1d94c8b87af7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.687 186962 DEBUG oslo_concurrency.lockutils [req-56b4b0c9-a696-4fee-bd81-81eef3000f7e req-13eb5e35-e999-4921-baf5-1d94c8b87af7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.688 186962 DEBUG oslo_concurrency.lockutils [req-56b4b0c9-a696-4fee-bd81-81eef3000f7e req-13eb5e35-e999-4921-baf5-1d94c8b87af7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.688 186962 DEBUG nova.compute.manager [req-56b4b0c9-a696-4fee-bd81-81eef3000f7e req-13eb5e35-e999-4921-baf5-1d94c8b87af7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Processing event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.691 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[711d79a1-4886-4e7f-b1f0-695a7fe49632]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.694 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.694 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.694 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc691e2c0-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:59 np0005539505 NetworkManager[55134]: <info>  [1764399239.6988] manager: (tapc691e2c0-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 29 01:53:59 np0005539505 kernel: tapc691e2c0-b0: entered promiscuous mode
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.702 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc691e2c0-b0, col_values=(('external_ids', {'iface-id': 'a88e36d5-5037-4505-8d26-de14faa22faf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:53:59 np0005539505 ovn_controller[95143]: 2025-11-29T06:53:59Z|00122|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.722 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.722 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.723 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dfe1a74a-aa66-4ffb-a1ae-6b5633dfcb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.724 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/c691e2c0-bf24-480c-9af6-236639f0492c.pid.haproxy
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID c691e2c0-bf24-480c-9af6-236639f0492c
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:53:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:53:59.724 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'env', 'PROCESS_TAG=haproxy-c691e2c0-bf24-480c-9af6-236639f0492c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c691e2c0-bf24-480c-9af6-236639f0492c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.891 186962 DEBUG nova.network.neutron [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updated VIF entry in instance network info cache for port 38a0a755-afa1-4c97-9582-82fb739c7e6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.892 186962 DEBUG nova.network.neutron [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.931 186962 DEBUG oslo_concurrency.lockutils [req-33c8a198-1491-482a-9b0e-2977a30aa5ac req-d98af5d8-defc-40f7-9485-122653c58e99 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.953 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399239.9517488, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.954 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Started (Lifecycle Event)#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.957 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.961 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.966 186962 INFO nova.virt.libvirt.driver [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance spawned successfully.#033[00m
Nov 29 01:53:59 np0005539505 nova_compute[186958]: 2025-11-29 06:53:59.967 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.006 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.016 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.023 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.024 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.024 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.025 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.026 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.027 186962 DEBUG nova.virt.libvirt.driver [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.067 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.068 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399239.952555, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.068 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.136 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.141 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399239.9600832, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.142 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.203 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.209 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.239 186962 INFO nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Took 5.19 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.241 186962 DEBUG nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.243 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:00 np0005539505 podman[218013]: 2025-11-29 06:54:00.156698467 +0000 UTC m=+0.034540584 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.516 186962 INFO nova.compute.manager [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Took 5.91 seconds to build instance.#033[00m
Nov 29 01:54:00 np0005539505 nova_compute[186958]: 2025-11-29 06:54:00.555 186962 DEBUG oslo_concurrency.lockutils [None req-d6a5bed4-feee-48b8-9f48-f56fe1f8c7f0 ea965b54cc694db4abef98ad9973e9f2 93dcd8ffe78147b69c244e2e3bfc2121 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:01 np0005539505 podman[218013]: 2025-11-29 06:54:01.718940309 +0000 UTC m=+1.596782396 container create 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.749 186962 DEBUG nova.compute.manager [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.751 186962 DEBUG oslo_concurrency.lockutils [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.751 186962 DEBUG oslo_concurrency.lockutils [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.751 186962 DEBUG oslo_concurrency.lockutils [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.752 186962 DEBUG nova.compute.manager [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:01 np0005539505 nova_compute[186958]: 2025-11-29 06:54:01.752 186962 WARNING nova.compute.manager [req-48658007-9c7e-4c0a-8644-9ad035ad8750 req-b5b25a9e-b400-4d22-89ce-a84ebdf856f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state active and task_state None.#033[00m
Nov 29 01:54:02 np0005539505 systemd[1]: Started libpod-conmon-38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f.scope.
Nov 29 01:54:02 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:54:02 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6365263f053b05ed5dbda5362f2c2341264beac455a54fc961e953b21e061d0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:54:02 np0005539505 podman[218026]: 2025-11-29 06:54:02.121732569 +0000 UTC m=+0.444138296 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:54:02 np0005539505 podman[218013]: 2025-11-29 06:54:02.291594806 +0000 UTC m=+2.169436903 container init 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 01:54:02 np0005539505 podman[218013]: 2025-11-29 06:54:02.307160194 +0000 UTC m=+2.185002261 container start 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 01:54:02 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [NOTICE]   (218078) : New worker (218080) forked
Nov 29 01:54:02 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [NOTICE]   (218078) : Loading success.
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.772 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c3ae15da-9b95-4494-b01e-202115261f9d in datapath 3e6779cb-7f6f-419d-b2e7-0b18b601b6be unbound from our chassis#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.775 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3e6779cb-7f6f-419d-b2e7-0b18b601b6be#033[00m
Nov 29 01:54:02 np0005539505 podman[218027]: 2025-11-29 06:54:02.782876662 +0000 UTC m=+1.107729928 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.787 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2afbd1a6-d724-4a29-a89e-0a373fc80e83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.788 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3e6779cb-71 in ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.791 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3e6779cb-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.791 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d88216-d3a2-4fe9-bd9b-770e9a0188cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.793 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6a24f23a-cee4-4e2a-ba0d-451d346c8d1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.803 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[40112983-2d1e-4bdd-b4dc-d6e8ce281af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.826 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7ab07d-eca1-4fcd-8524-dcfe4e60b62a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 nova_compute[186958]: 2025-11-29 06:54:02.828 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.857 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7de7e85a-9d10-4d31-b60c-27832895e5dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 NetworkManager[55134]: <info>  [1764399242.8798] manager: (tap3e6779cb-70): new Veth device (/org/freedesktop/NetworkManager/Devices/67)
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.879 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c9dc4849-eb3d-4b79-89f4-3618037c9284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 systemd-udevd[218099]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.931 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2b11a1ca-95b8-4f4f-940c-4c0db91e89b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.935 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b62adca8-5b66-4c1e-846d-d10739e8957e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:02 np0005539505 NetworkManager[55134]: <info>  [1764399242.9711] device (tap3e6779cb-70): carrier: link connected
Nov 29 01:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:02.978 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f29fbe1d-2a3f-4987-bef1-764ee63def64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.009 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[335adae8-a184-4881-a5f1-70eb485b2650]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e6779cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:bd:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469055, 'reachable_time': 26842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218118, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.026 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c5336707-c073-40bf-a6ec-afd2573c012a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe37:bd1b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469055, 'tstamp': 469055}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218119, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.047 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82e7a3dc-4107-4cae-ab91-e8714a2c9fdf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3e6779cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:37:bd:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469055, 'reachable_time': 26842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218120, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.083 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6d7318-ba2e-426e-9c4d-e46c50cb12ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.133 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48817bbc-879d-4ff3-a56c-05f29747fe0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.135 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6779cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.135 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.136 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e6779cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:03 np0005539505 NetworkManager[55134]: <info>  [1764399243.1384] manager: (tap3e6779cb-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 29 01:54:03 np0005539505 nova_compute[186958]: 2025-11-29 06:54:03.138 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:03 np0005539505 kernel: tap3e6779cb-70: entered promiscuous mode
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.146 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3e6779cb-70, col_values=(('external_ids', {'iface-id': '933cdbf7-1588-4a26-b171-b3d2ec3cd1a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:03 np0005539505 nova_compute[186958]: 2025-11-29 06:54:03.147 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:03 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:03Z|00123|binding|INFO|Releasing lport 933cdbf7-1588-4a26-b171-b3d2ec3cd1a3 from this chassis (sb_readonly=0)
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.158 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.158 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a4506877-956b-4c3c-b385-e1844db965bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.159 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-3e6779cb-7f6f-419d-b2e7-0b18b601b6be
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.pid.haproxy
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 3e6779cb-7f6f-419d-b2e7-0b18b601b6be
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:54:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:03.160 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'env', 'PROCESS_TAG=haproxy-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3e6779cb-7f6f-419d-b2e7-0b18b601b6be.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:54:03 np0005539505 nova_compute[186958]: 2025-11-29 06:54:03.164 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:03 np0005539505 podman[218152]: 2025-11-29 06:54:03.54612233 +0000 UTC m=+0.062831202 container create db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:54:03 np0005539505 systemd[1]: Started libpod-conmon-db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b.scope.
Nov 29 01:54:03 np0005539505 nova_compute[186958]: 2025-11-29 06:54:03.598 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:03 np0005539505 podman[218152]: 2025-11-29 06:54:03.521925878 +0000 UTC m=+0.038634770 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:54:03 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:54:03 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78a1553def243cbf273bab52631d8061b2b1435960211787e997a70038adedd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:54:03 np0005539505 podman[218152]: 2025-11-29 06:54:03.626059292 +0000 UTC m=+0.142768184 container init db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:54:03 np0005539505 podman[218152]: 2025-11-29 06:54:03.631866636 +0000 UTC m=+0.148575508 container start db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:03 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [NOTICE]   (218172) : New worker (218174) forked
Nov 29 01:54:03 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [NOTICE]   (218172) : Loading success.
Nov 29 01:54:04 np0005539505 nova_compute[186958]: 2025-11-29 06:54:04.412 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Check if temp file /var/lib/nova/instances/tmpkxgr55ve exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Nov 29 01:54:04 np0005539505 nova_compute[186958]: 2025-11-29 06:54:04.413 186962 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Nov 29 01:54:05 np0005539505 nova_compute[186958]: 2025-11-29 06:54:05.687 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:05 np0005539505 nova_compute[186958]: 2025-11-29 06:54:05.760 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:05 np0005539505 nova_compute[186958]: 2025-11-29 06:54:05.762 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:05 np0005539505 nova_compute[186958]: 2025-11-29 06:54:05.822 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:06 np0005539505 podman[218189]: 2025-11-29 06:54:06.777561058 +0000 UTC m=+0.095465931 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:54:07 np0005539505 nova_compute[186958]: 2025-11-29 06:54:07.831 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:08 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:54:08 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:54:08 np0005539505 systemd-logind[794]: New session 31 of user nova.
Nov 29 01:54:08 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:54:08 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:54:08 np0005539505 systemd[218213]: Queued start job for default target Main User Target.
Nov 29 01:54:08 np0005539505 systemd[218213]: Created slice User Application Slice.
Nov 29 01:54:08 np0005539505 systemd[218213]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:54:08 np0005539505 systemd[218213]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:54:08 np0005539505 systemd[218213]: Reached target Paths.
Nov 29 01:54:08 np0005539505 systemd[218213]: Reached target Timers.
Nov 29 01:54:08 np0005539505 systemd[218213]: Starting D-Bus User Message Bus Socket...
Nov 29 01:54:08 np0005539505 systemd[218213]: Starting Create User's Volatile Files and Directories...
Nov 29 01:54:08 np0005539505 systemd[218213]: Finished Create User's Volatile Files and Directories.
Nov 29 01:54:08 np0005539505 systemd[218213]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:54:08 np0005539505 systemd[218213]: Reached target Sockets.
Nov 29 01:54:08 np0005539505 systemd[218213]: Reached target Basic System.
Nov 29 01:54:08 np0005539505 systemd[218213]: Reached target Main User Target.
Nov 29 01:54:08 np0005539505 systemd[218213]: Startup finished in 195ms.
Nov 29 01:54:08 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:54:08 np0005539505 systemd[1]: Started Session 31 of User nova.
Nov 29 01:54:08 np0005539505 systemd[1]: session-31.scope: Deactivated successfully.
Nov 29 01:54:08 np0005539505 systemd-logind[794]: Session 31 logged out. Waiting for processes to exit.
Nov 29 01:54:08 np0005539505 systemd-logind[794]: Removed session 31.
Nov 29 01:54:08 np0005539505 nova_compute[186958]: 2025-11-29 06:54:08.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.503 186962 DEBUG nova.compute.manager [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.503 186962 DEBUG oslo_concurrency.lockutils [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.503 186962 DEBUG oslo_concurrency.lockutils [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.504 186962 DEBUG oslo_concurrency.lockutils [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.504 186962 DEBUG nova.compute.manager [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:09 np0005539505 nova_compute[186958]: 2025-11-29 06:54:09.505 186962 DEBUG nova.compute.manager [req-5fbc6eea-7ace-411a-b96a-7f58a604b6d7 req-4f41c176-8de1-4618-bd8a-bce470febab7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:10 np0005539505 podman[218229]: 2025-11-29 06:54:10.734016566 +0000 UTC m=+0.056522173 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.847 186962 INFO nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Took 5.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.848 186962 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.869 186962 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkxgr55ve',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c740b153-ca40-4917-9ce9-b4debd1533e6),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.898 186962 DEBUG nova.objects.instance [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lazy-loading 'migration_context' on Instance uuid 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.899 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.901 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.901 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.950 186962 DEBUG nova.virt.libvirt.vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:00Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.950 186962 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.951 186962 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.952 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating guest XML with vif config: <interface type="ethernet">
Nov 29 01:54:10 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:dd:1a:af"/>
Nov 29 01:54:10 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 01:54:10 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:54:10 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 01:54:10 np0005539505 nova_compute[186958]:  <target dev="tap38a0a755-af"/>
Nov 29 01:54:10 np0005539505 nova_compute[186958]: </interface>
Nov 29 01:54:10 np0005539505 nova_compute[186958]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Nov 29 01:54:10 np0005539505 nova_compute[186958]: 2025-11-29 06:54:10.952 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.404 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.405 186962 INFO nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.503 186962 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.628 186962 DEBUG nova.compute.manager [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.628 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.629 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.629 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.629 186962 DEBUG nova.compute.manager [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.629 186962 WARNING nova.compute.manager [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.629 186962 DEBUG nova.compute.manager [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-changed-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.630 186962 DEBUG nova.compute.manager [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Refreshing instance network info cache due to event network-changed-38a0a755-afa1-4c97-9582-82fb739c7e6c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.630 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.630 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:11 np0005539505 nova_compute[186958]: 2025-11-29 06:54:11.630 186962 DEBUG nova.network.neutron [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Refreshing network info cache for port 38a0a755-afa1-4c97-9582-82fb739c7e6c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.006 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.007 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.511 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.511 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.834 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.913 186962 DEBUG nova.network.neutron [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updated VIF entry in instance network info cache for port 38a0a755-afa1-4c97-9582-82fb739c7e6c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:54:12 np0005539505 nova_compute[186958]: 2025-11-29 06:54:12.914 186962 DEBUG nova.network.neutron [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Updating instance_info_cache with network_info: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.007 186962 DEBUG oslo_concurrency.lockutils [req-20b20537-719d-4c08-8aa5-dadd2e364011 req-143e82c3-8271-4be5-80bf-d8a7656c60fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.015 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.015 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.447 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399253.4472404, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.448 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.466 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.611 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.631 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.633 186962 DEBUG nova.virt.libvirt.migration [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.637 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Nov 29 01:54:13 np0005539505 kernel: tap38a0a755-af (unregistering): left promiscuous mode
Nov 29 01:54:13 np0005539505 NetworkManager[55134]: <info>  [1764399253.8207] device (tap38a0a755-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00124|binding|INFO|Releasing lport 38a0a755-afa1-4c97-9582-82fb739c7e6c from this chassis (sb_readonly=0)
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00125|binding|INFO|Setting lport 38a0a755-afa1-4c97-9582-82fb739c7e6c down in Southbound
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00126|binding|INFO|Releasing lport c3ae15da-9b95-4494-b01e-202115261f9d from this chassis (sb_readonly=0)
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00127|binding|INFO|Setting lport c3ae15da-9b95-4494-b01e-202115261f9d down in Southbound
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00128|binding|INFO|Removing iface tap38a0a755-af ovn-installed in OVS
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00129|binding|INFO|Releasing lport 933cdbf7-1588-4a26-b171-b3d2ec3cd1a3 from this chassis (sb_readonly=0)
Nov 29 01:54:13 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:13Z|00130|binding|INFO|Releasing lport a88e36d5-5037-4505-8d26-de14faa22faf from this chassis (sb_readonly=0)
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.861 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:1a:af 10.100.0.7'], port_security=['fa:16:3e:dd:1a:af 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'a43628b3-9efd-4940-9509-686038e16aeb'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2018647846', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c691e2c0-bf24-480c-9af6-236639f0492c', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2018647846', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d3611d1-4470-4c82-ad19-45393cd04081, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=38a0a755-afa1-4c97-9582-82fb739c7e6c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.866 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:a9:0c 19.80.0.52'], port_security=['fa:16:3e:02:a9:0c 19.80.0.52'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['38a0a755-afa1-4c97-9582-82fb739c7e6c'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-70072018', 'neutron:cidrs': '19.80.0.52/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-70072018', 'neutron:project_id': '93dcd8ffe78147b69c244e2e3bfc2121', 'neutron:revision_number': '3', 'neutron:security_group_ids': '8ea4a5be-b4e5-421b-8054-0313211cec38', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=69db7c70-3ac5-4b08-99b1-a77caa10cb9e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c3ae15da-9b95-4494-b01e-202115261f9d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.869 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 38a0a755-afa1-4c97-9582-82fb739c7e6c in datapath c691e2c0-bf24-480c-9af6-236639f0492c unbound from our chassis#033[00m
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.872 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c691e2c0-bf24-480c-9af6-236639f0492c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.876 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.876 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f9f610c2-1607-4320-a916-517ba8451091]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:13.878 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c namespace which is not needed anymore#033[00m
Nov 29 01:54:13 np0005539505 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Nov 29 01:54:13 np0005539505 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001b.scope: Consumed 13.473s CPU time.
Nov 29 01:54:13 np0005539505 nova_compute[186958]: 2025-11-29 06:54:13.977 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:13 np0005539505 systemd-machined[153285]: Machine qemu-14-instance-0000001b terminated.
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.019 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.024 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.069 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.070 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.071 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.136 186962 DEBUG nova.virt.libvirt.guest [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6' (instance-0000001b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.137 186962 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migration operation has completed#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.137 186962 INFO nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] _post_live_migration() is started..#033[00m
Nov 29 01:54:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [NOTICE]   (218078) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [NOTICE]   (218078) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [WARNING]  (218078) : Exiting Master process...
Nov 29 01:54:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [ALERT]    (218078) : Current worker (218080) exited with code 143 (Terminated)
Nov 29 01:54:14 np0005539505 neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c[218055]: [WARNING]  (218078) : All workers exited. Exiting... (0)
Nov 29 01:54:14 np0005539505 systemd[1]: libpod-38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f.scope: Deactivated successfully.
Nov 29 01:54:14 np0005539505 podman[218297]: 2025-11-29 06:54:14.609298749 +0000 UTC m=+0.604410203 container died 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.655 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.655 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.675 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.771 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.772 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.780 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.781 186962 INFO nova.compute.claims [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.939 186962 DEBUG nova.compute.provider_tree [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:14 np0005539505 nova_compute[186958]: 2025-11-29 06:54:14.963 186962 DEBUG nova.scheduler.client.report [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.000 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.001 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.068 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.069 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.095 186962 INFO nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.118 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.260 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.263 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.264 186962 INFO nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Creating image(s)#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.265 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.265 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.267 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.293 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.323 186962 DEBUG nova.policy [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:54:15 np0005539505 systemd[1]: var-lib-containers-storage-overlay-6365263f053b05ed5dbda5362f2c2341264beac455a54fc961e953b21e061d0e-merged.mount: Deactivated successfully.
Nov 29 01:54:15 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.365 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.367 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.368 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.379 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.446 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.447 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:15 np0005539505 podman[218297]: 2025-11-29 06:54:15.61900751 +0000 UTC m=+1.614118914 container cleanup 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.618 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk 1073741824" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.619 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.620 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:15 np0005539505 systemd[1]: libpod-conmon-38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f.scope: Deactivated successfully.
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.680 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.681 186962 DEBUG nova.virt.disk.api [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Checking if we can resize image /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.682 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.744 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.745 186962 DEBUG nova.virt.disk.api [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Cannot resize image /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.745 186962 DEBUG nova.objects.instance [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'migration_context' on Instance uuid ca75cc77-948e-4e28-a5db-b95961a337a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.770 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.770 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Ensure instance console log exists: /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.771 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.771 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.772 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 podman[218354]: 2025-11-29 06:54:15.7904002 +0000 UTC m=+0.136813406 container remove 38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.795 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[00849cbf-f0b3-4207-a506-1006943bc7b9]: (4, ('Sat Nov 29 06:54:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f)\n38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f\nSat Nov 29 06:54:15 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c (38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f)\n38dc0b308994ed3748cc49354d75121562f91cc01b30f8430e58e0116408521f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.797 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83472596-effe-4f79-bf85-bf25111361f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.798 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc691e2c0-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539505 kernel: tapc691e2c0-b0: left promiscuous mode
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.821 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb672a0-ee6c-4ef2-b9b4-840ac8dc7bfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.823 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.823 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.823 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.823 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.823 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-unplugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.824 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.825 186962 WARNING nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.825 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.825 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.825 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.825 186962 DEBUG oslo_concurrency.lockutils [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.826 186962 DEBUG nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:15 np0005539505 nova_compute[186958]: 2025-11-29 06:54:15.826 186962 WARNING nova.compute.manager [req-7ec98cfc-4e6d-42bb-9778-1fec7474da44 req-c5231efa-b050-4c9c-90eb-ede195113752 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.839 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ced2333e-a7d5-4934-aa47-2d4a31f9a73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.842 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6db17e8f-365d-44d2-be2e-02986bcf26dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.866 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[185b447d-af18-47b0-aaaa-d72d301e4c9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468700, 'reachable_time': 29381, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218379, 'error': None, 'target': 'ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 systemd[1]: run-netns-ovnmeta\x2dc691e2c0\x2dbf24\x2d480c\x2d9af6\x2d236639f0492c.mount: Deactivated successfully.
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.881 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c691e2c0-bf24-480c-9af6-236639f0492c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.884 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[9589936a-df3d-4812-8b02-c2d3cda6e346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.886 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c3ae15da-9b95-4494-b01e-202115261f9d in datapath 3e6779cb-7f6f-419d-b2e7-0b18b601b6be unbound from our chassis#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.890 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3e6779cb-7f6f-419d-b2e7-0b18b601b6be, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.891 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[13ef6f5e-05fc-4079-9634-fe94be9a2094]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:15.892 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be namespace which is not needed anymore#033[00m
Nov 29 01:54:16 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [NOTICE]   (218172) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:16 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [NOTICE]   (218172) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:16 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [WARNING]  (218172) : Exiting Master process...
Nov 29 01:54:16 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [ALERT]    (218172) : Current worker (218174) exited with code 143 (Terminated)
Nov 29 01:54:16 np0005539505 neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be[218168]: [WARNING]  (218172) : All workers exited. Exiting... (0)
Nov 29 01:54:16 np0005539505 systemd[1]: libpod-db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b.scope: Deactivated successfully.
Nov 29 01:54:16 np0005539505 podman[218397]: 2025-11-29 06:54:16.128990251 +0000 UTC m=+0.083592526 container died db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:54:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay-78a1553def243cbf273bab52631d8061b2b1435960211787e997a70038adedd6-merged.mount: Deactivated successfully.
Nov 29 01:54:16 np0005539505 podman[218397]: 2025-11-29 06:54:16.170166832 +0000 UTC m=+0.124769097 container cleanup db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:54:16 np0005539505 systemd[1]: libpod-conmon-db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b.scope: Deactivated successfully.
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.244 186962 DEBUG nova.network.neutron [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Activated binding for port 38a0a755-afa1-4c97-9582-82fb739c7e6c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.245 186962 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.246 186962 DEBUG nova.virt.libvirt.vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:53:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1120356955',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1120356955',id=27,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='93dcd8ffe78147b69c244e2e3bfc2121',ramdisk_id='',reservation_id='r-x246813w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1343206834',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1343206834-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:03Z,user_data=None,user_id='ea965b54cc694db4abef98ad9973e9f2',uuid=7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.246 186962 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converting VIF {"id": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "address": "fa:16:3e:dd:1a:af", "network": {"id": "c691e2c0-bf24-480c-9af6-236639f0492c", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1801234319-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "93dcd8ffe78147b69c244e2e3bfc2121", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap38a0a755-af", "ovs_interfaceid": "38a0a755-afa1-4c97-9582-82fb739c7e6c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.247 186962 DEBUG nova.network.os_vif_util [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.247 186962 DEBUG os_vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.249 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.249 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38a0a755-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.253 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:54:16 np0005539505 podman[218426]: 2025-11-29 06:54:16.255687111 +0000 UTC m=+0.056937585 container remove db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.262 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[80db44ec-5734-422e-af0b-d96f8a0d1247]: (4, ('Sat Nov 29 06:54:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be (db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b)\ndb352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b\nSat Nov 29 06:54:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be (db352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b)\ndb352615db863c85150c73027b6e52276dd859a4886d3b421db32d1cb254eb6b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.264 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fc852602-95dc-4dc3-83cd-0d09ff447fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.265 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6779cb-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.265 186962 INFO os_vif [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:1a:af,bridge_name='br-int',has_traffic_filtering=True,id=38a0a755-afa1-4c97-9582-82fb739c7e6c,network=Network(c691e2c0-bf24-480c-9af6-236639f0492c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap38a0a755-af')#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.266 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.266 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.266 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.266 186962 DEBUG nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.267 186962 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Deleting instance files /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6_del#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.268 186962 INFO nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Deletion of /var/lib/nova/instances/7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6_del complete#033[00m
Nov 29 01:54:16 np0005539505 kernel: tap3e6779cb-70: left promiscuous mode
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.272 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e77fe9a8-20c3-4307-8dc1-5c4ce34897bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.283 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.298 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[487872a9-96c1-43d5-bc5c-47f72000c1a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.300 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[307889d3-4a0f-4168-9d45-42623550556f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.321 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f780c760-de18-470c-85bc-a701c51ffc19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469043, 'reachable_time': 44528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218441, 'error': None, 'target': 'ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.324 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3e6779cb-7f6f-419d-b2e7-0b18b601b6be deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:16.325 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[2ead6c71-8d34-49b9-88f2-9d84848259fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:16 np0005539505 systemd[1]: run-netns-ovnmeta\x2d3e6779cb\x2d7f6f\x2d419d\x2db2e7\x2d0b18b601b6be.mount: Deactivated successfully.
Nov 29 01:54:16 np0005539505 nova_compute[186958]: 2025-11-29 06:54:16.899 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Successfully created port: a38a0b58-03dd-4e7a-b14d-caa4b2952069 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:54:17 np0005539505 nova_compute[186958]: 2025-11-29 06:54:17.836 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:17 np0005539505 nova_compute[186958]: 2025-11-29 06:54:17.864 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Successfully updated port: a38a0b58-03dd-4e7a-b14d-caa4b2952069 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:54:17 np0005539505 nova_compute[186958]: 2025-11-29 06:54:17.911 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:17 np0005539505 nova_compute[186958]: 2025-11-29 06:54:17.912 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquired lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:17 np0005539505 nova_compute[186958]: 2025-11-29 06:54:17.913 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.116 186962 DEBUG nova.compute.manager [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.117 186962 DEBUG oslo_concurrency.lockutils [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.118 186962 DEBUG oslo_concurrency.lockutils [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.118 186962 DEBUG oslo_concurrency.lockutils [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.118 186962 DEBUG nova.compute.manager [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] No waiting events found dispatching network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.119 186962 WARNING nova.compute.manager [req-6d9fa8b0-fcdb-4998-bea2-244adadea369 req-2658d129-133d-45f5-842b-3ab2493273a0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Received unexpected event network-vif-plugged-38a0a755-afa1-4c97-9582-82fb739c7e6c for instance with vm_state active and task_state migrating.#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.119 186962 DEBUG nova.compute.manager [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-changed-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.120 186962 DEBUG nova.compute.manager [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Refreshing instance network info cache due to event network-changed-a38a0b58-03dd-4e7a-b14d-caa4b2952069. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.120 186962 DEBUG oslo_concurrency.lockutils [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:18 np0005539505 nova_compute[186958]: 2025-11-29 06:54:18.262 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:54:18 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:54:18 np0005539505 systemd[218213]: Activating special unit Exit the Session...
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped target Main User Target.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped target Basic System.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped target Paths.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped target Sockets.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped target Timers.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:54:18 np0005539505 systemd[218213]: Closed D-Bus User Message Bus Socket.
Nov 29 01:54:18 np0005539505 systemd[218213]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:54:18 np0005539505 systemd[218213]: Removed slice User Application Slice.
Nov 29 01:54:18 np0005539505 systemd[218213]: Reached target Shutdown.
Nov 29 01:54:18 np0005539505 systemd[218213]: Finished Exit the Session.
Nov 29 01:54:18 np0005539505 systemd[218213]: Reached target Exit the Session.
Nov 29 01:54:18 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:54:18 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:54:18 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:54:18 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:54:18 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:54:18 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:54:18 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.359 186962 DEBUG nova.network.neutron [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updating instance_info_cache with network_info: [{"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.407 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Releasing lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.408 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Instance network_info: |[{"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.409 186962 DEBUG oslo_concurrency.lockutils [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.409 186962 DEBUG nova.network.neutron [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Refreshing network info cache for port a38a0b58-03dd-4e7a-b14d-caa4b2952069 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.412 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Start _get_guest_xml network_info=[{"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.417 186962 WARNING nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.421 186962 DEBUG nova.virt.libvirt.host [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.422 186962 DEBUG nova.virt.libvirt.host [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.425 186962 DEBUG nova.virt.libvirt.host [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.426 186962 DEBUG nova.virt.libvirt.host [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.427 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.428 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.428 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.428 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.429 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.429 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.429 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.429 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.430 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.430 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.430 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.430 186962 DEBUG nova.virt.hardware [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.434 186962 DEBUG nova.virt.libvirt.vif [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2126280730',display_name='tempest-ImagesTestJSON-server-2126280730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2126280730',id=31,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-a3qp5rci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:15Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=ca75cc77-948e-4e28-a5db-b95961a337a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.434 186962 DEBUG nova.network.os_vif_util [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.435 186962 DEBUG nova.network.os_vif_util [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.436 186962 DEBUG nova.objects.instance [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'pci_devices' on Instance uuid ca75cc77-948e-4e28-a5db-b95961a337a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.452 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <uuid>ca75cc77-948e-4e28-a5db-b95961a337a5</uuid>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <name>instance-0000001f</name>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:name>tempest-ImagesTestJSON-server-2126280730</nova:name>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:54:20</nova:creationTime>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:user uuid="315be492c2ce4b9f8af2898e6794a256">tempest-ImagesTestJSON-1674785298-project-member</nova:user>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:project uuid="78f8ba841bbe4fdcb9d9e2237d97bf73">tempest-ImagesTestJSON-1674785298</nova:project>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        <nova:port uuid="a38a0b58-03dd-4e7a-b14d-caa4b2952069">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="serial">ca75cc77-948e-4e28-a5db-b95961a337a5</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="uuid">ca75cc77-948e-4e28-a5db-b95961a337a5</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.config"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:d3:8a:64"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <target dev="tapa38a0b58-03"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/console.log" append="off"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:54:20 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:54:20 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:54:20 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:54:20 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.454 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Preparing to wait for external event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.456 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.456 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.457 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.459 186962 DEBUG nova.virt.libvirt.vif [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:54:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2126280730',display_name='tempest-ImagesTestJSON-server-2126280730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2126280730',id=31,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-a3qp5rci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:54:15Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=ca75cc77-948e-4e28-a5db-b95961a337a5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.459 186962 DEBUG nova.network.os_vif_util [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.461 186962 DEBUG nova.network.os_vif_util [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.462 186962 DEBUG os_vif [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.463 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.464 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.464 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.468 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.469 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa38a0b58-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.470 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa38a0b58-03, col_values=(('external_ids', {'iface-id': 'a38a0b58-03dd-4e7a-b14d-caa4b2952069', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:8a:64', 'vm-uuid': 'ca75cc77-948e-4e28-a5db-b95961a337a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:20 np0005539505 NetworkManager[55134]: <info>  [1764399260.4769] manager: (tapa38a0b58-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.483 186962 INFO os_vif [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03')#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.621 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.621 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.621 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] No VIF found with MAC fa:16:3e:d3:8a:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:54:20 np0005539505 nova_compute[186958]: 2025-11-29 06:54:20.622 186962 INFO nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Using config drive#033[00m
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.429 186962 INFO nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Creating config drive at /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.config#033[00m
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.442 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyizwtcct execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.607 186962 DEBUG oslo_concurrency.processutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyizwtcct" returned: 0 in 0.164s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:21 np0005539505 kernel: tapa38a0b58-03: entered promiscuous mode
Nov 29 01:54:21 np0005539505 NetworkManager[55134]: <info>  [1764399261.7076] manager: (tapa38a0b58-03): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:21Z|00131|binding|INFO|Claiming lport a38a0b58-03dd-4e7a-b14d-caa4b2952069 for this chassis.
Nov 29 01:54:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:21Z|00132|binding|INFO|a38a0b58-03dd-4e7a-b14d-caa4b2952069: Claiming fa:16:3e:d3:8a:64 10.100.0.10
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.717 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.785 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:21Z|00133|binding|INFO|Setting lport a38a0b58-03dd-4e7a-b14d-caa4b2952069 ovn-installed in OVS
Nov 29 01:54:21 np0005539505 nova_compute[186958]: 2025-11-29 06:54:21.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:21 np0005539505 systemd-machined[153285]: New machine qemu-15-instance-0000001f.
Nov 29 01:54:21 np0005539505 systemd[1]: Started Virtual Machine qemu-15-instance-0000001f.
Nov 29 01:54:21 np0005539505 systemd-udevd[218477]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:21 np0005539505 NetworkManager[55134]: <info>  [1764399261.8742] device (tapa38a0b58-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:54:21 np0005539505 NetworkManager[55134]: <info>  [1764399261.8764] device (tapa38a0b58-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:54:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:21Z|00134|binding|INFO|Setting lport a38a0b58-03dd-4e7a-b14d-caa4b2952069 up in Southbound
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.885 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:8a:64 10.100.0.10'], port_security=['fa:16:3e:d3:8a:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a38a0b58-03dd-4e7a-b14d-caa4b2952069) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.887 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a38a0b58-03dd-4e7a-b14d-caa4b2952069 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba bound to our chassis#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.889 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.907 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d3114fd7-abcd-4938-bb18-2343872c01a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.909 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap17ec2ca4-31 in ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.915 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap17ec2ca4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.916 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[615eb485-16a7-49c6-8f8b-30f3f98a3a27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.920 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[62ce7137-81be-444a-87d6-dced5f0e3eef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:21 np0005539505 podman[218463]: 2025-11-29 06:54:21.930668496 +0000 UTC m=+0.117736729 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.941 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[63fa40ef-be2d-462e-9298-6cb5452f2e78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:21 np0005539505 podman[218465]: 2025-11-29 06:54:21.957162703 +0000 UTC m=+0.138637228 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, version=9.6, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public)
Nov 29 01:54:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:21.971 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1021099a-9b2d-4617-a69c-632fd792737c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.003 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9ac3c85b-a6ab-48f1-bf88-b8130609c46c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 NetworkManager[55134]: <info>  [1764399262.0101] manager: (tap17ec2ca4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Nov 29 01:54:22 np0005539505 systemd-udevd[218485]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.011 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[51bb7be9-cf42-4d14-a2d3-a3959a909a01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.050 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[77281673-0b20-499e-9af0-0658b090c330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.053 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c430c9fb-0dd2-463c-ad75-1c88de0cb21d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 NetworkManager[55134]: <info>  [1764399262.0800] device (tap17ec2ca4-30): carrier: link connected
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.091 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c0e01408-efec-48fd-ab63-3bb51050b40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.113 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[73b88bcf-3879-4737-9d20-655641a994f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470966, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218537, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.136 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8dae4d-799a-4ef2-b5b2-33dce5b69fc1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:556b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 470966, 'tstamp': 470966}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218538, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.155 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0da2419a-a1e1-4426-a76f-6d9ecd69804d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap17ec2ca4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:55:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470966, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218539, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.197 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[53d7a546-d8dd-4801-a3bd-1f054db9b496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.268 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[804d09d7-2b4e-4d7f-9722-5311f41ca20b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.270 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.271 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.272 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17ec2ca4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:22 np0005539505 NetworkManager[55134]: <info>  [1764399262.2757] manager: (tap17ec2ca4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 29 01:54:22 np0005539505 kernel: tap17ec2ca4-30: entered promiscuous mode
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.278 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.279 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.279 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.285 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap17ec2ca4-30, col_values=(('external_ids', {'iface-id': '97d66506-c891-4bf7-8595-2d091560f247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:22Z|00135|binding|INFO|Releasing lport 97d66506-c891-4bf7-8595-2d091560f247 from this chassis (sb_readonly=0)
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.287 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.304 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.305 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe44886-2fa9-4c50-934e-e7a7e56eddfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.306 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.pid.haproxy
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:54:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:22.307 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'env', 'PROCESS_TAG=haproxy-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/17ec2ca4-3fa9-41aa-80ef-35bf92d404ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.306 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.309 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.309 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.310 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.386 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.429 186962 DEBUG nova.compute.manager [req-00926602-ce64-4497-b5a5-5b1f2e4f98ae req-498ee74e-20a2-4104-8aa4-bde6d9fd7860 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.432 186962 DEBUG oslo_concurrency.lockutils [req-00926602-ce64-4497-b5a5-5b1f2e4f98ae req-498ee74e-20a2-4104-8aa4-bde6d9fd7860 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.433 186962 DEBUG oslo_concurrency.lockutils [req-00926602-ce64-4497-b5a5-5b1f2e4f98ae req-498ee74e-20a2-4104-8aa4-bde6d9fd7860 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.433 186962 DEBUG oslo_concurrency.lockutils [req-00926602-ce64-4497-b5a5-5b1f2e4f98ae req-498ee74e-20a2-4104-8aa4-bde6d9fd7860 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.434 186962 DEBUG nova.compute.manager [req-00926602-ce64-4497-b5a5-5b1f2e4f98ae req-498ee74e-20a2-4104-8aa4-bde6d9fd7860 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Processing event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.469 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.471 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.537 186962 DEBUG oslo_concurrency.processutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.634 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399262.6327324, ca75cc77-948e-4e28-a5db-b95961a337a5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.635 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] VM Started (Lifecycle Event)#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.638 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.642 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.645 186962 INFO nova.virt.libvirt.driver [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Instance spawned successfully.#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.646 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.722 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.724 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.739 186962 WARNING nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.741 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5689MB free_disk=73.26797103881836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.741 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.741 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:22 np0005539505 podman[218584]: 2025-11-29 06:54:22.743264483 +0000 UTC m=+0.057288024 container create 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:54:22 np0005539505 systemd[1]: Started libpod-conmon-01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217.scope.
Nov 29 01:54:22 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:54:22 np0005539505 podman[218584]: 2025-11-29 06:54:22.714988058 +0000 UTC m=+0.029011629 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:54:22 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b27623b3cfb92dec4ad04af794a6f893bf67614e0082a681acf042b36739d5f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:54:22 np0005539505 podman[218584]: 2025-11-29 06:54:22.828881976 +0000 UTC m=+0.142905547 container init 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:54:22 np0005539505 podman[218584]: 2025-11-29 06:54:22.834491074 +0000 UTC m=+0.148514625 container start 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.851 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.852 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399262.6330004, ca75cc77-948e-4e28-a5db-b95961a337a5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.853 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:54:22 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [NOTICE]   (218602) : New worker (218604) forked
Nov 29 01:54:22 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [NOTICE]   (218602) : Loading success.
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.886 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.888 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration for instance 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.893 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.893 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.894 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.894 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.895 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.895 186962 DEBUG nova.virt.libvirt.driver [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.904 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399262.6416779, ca75cc77-948e-4e28-a5db-b95961a337a5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.904 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.927 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.961 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Migration c740b153-ca40-4917-9ce9-b4debd1533e6 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.962 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Instance ca75cc77-948e-4e28-a5db-b95961a337a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.963 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:54:22 np0005539505 nova_compute[186958]: 2025-11-29 06:54:22.963 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.039 186962 DEBUG nova.compute.provider_tree [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.829 186962 DEBUG nova.scheduler.client.report [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.835 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.841 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.875 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.878 186962 DEBUG nova.compute.resource_tracker [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.879 186962 DEBUG oslo_concurrency.lockutils [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.895 186962 INFO nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Took 8.63 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.895 186962 DEBUG nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:23 np0005539505 nova_compute[186958]: 2025-11-29 06:54:23.899 186962 INFO nova.compute.manager [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Migrating instance to compute-1.ctlplane.example.com finished successfully.#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.249 186962 INFO nova.compute.manager [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Took 9.52 seconds to build instance.#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.289 186962 DEBUG oslo_concurrency.lockutils [None req-c6af30bb-9aea-47c0-bbe5-965e4b7ab5d6 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:24.314 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.316 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:24.317 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.336 186962 INFO nova.scheduler.client.report [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] Deleted allocation for migration c740b153-ca40-4917-9ce9-b4debd1533e6#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.337 186962 DEBUG nova.virt.libvirt.driver [None req-cf938d11-909f-4277-bb6a-3e9bc1849860 5fd239f5ddc3494e9e5be527102acf37 c4152d593bb743c58c1274b268243acf - - default default] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.487 186962 DEBUG nova.network.neutron [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updated VIF entry in instance network info cache for port a38a0b58-03dd-4e7a-b14d-caa4b2952069. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.488 186962 DEBUG nova.network.neutron [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updating instance_info_cache with network_info: [{"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.508 186962 DEBUG oslo_concurrency.lockutils [req-1ff31749-e865-4ec4-be58-53adef472238 req-e3daf78b-737e-465f-91e0-8fcd849b7484 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.555 186962 DEBUG nova.compute.manager [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.558 186962 DEBUG oslo_concurrency.lockutils [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.559 186962 DEBUG oslo_concurrency.lockutils [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.559 186962 DEBUG oslo_concurrency.lockutils [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.560 186962 DEBUG nova.compute.manager [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] No waiting events found dispatching network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:24 np0005539505 nova_compute[186958]: 2025-11-29 06:54:24.560 186962 WARNING nova.compute.manager [req-4ab62eb8-1243-462e-b071-45411fe62239 req-f4de898b-f9e0-4ccb-9438-a2127bb0c943 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received unexpected event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:54:25 np0005539505 nova_compute[186958]: 2025-11-29 06:54:25.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:26 np0005539505 nova_compute[186958]: 2025-11-29 06:54:26.513 186962 DEBUG nova.compute.manager [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:26 np0005539505 nova_compute[186958]: 2025-11-29 06:54:26.596 186962 INFO nova.compute.manager [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] instance snapshotting#033[00m
Nov 29 01:54:26 np0005539505 podman[218613]: 2025-11-29 06:54:26.763474478 +0000 UTC m=+0.086546649 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:54:26 np0005539505 nova_compute[186958]: 2025-11-29 06:54:26.816 186962 INFO nova.virt.libvirt.driver [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Beginning live snapshot process#033[00m
Nov 29 01:54:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:26.931 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:26.934 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:26.935 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:27 np0005539505 virtqemud[186353]: invalid argument: disk vda does not have an active block job
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.084 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.170 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json -f qcow2" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.171 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.238 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json -f qcow2" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.252 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.344 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.346 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.742 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e.delta 1073741824" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.744 186962 INFO nova.virt.libvirt.driver [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Quiescing instance not available: QEMU guest agent is not enabled.#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.820 186962 DEBUG nova.virt.libvirt.guest [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.824 186962 INFO nova.virt.libvirt.driver [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Skipping quiescing instance: QEMU guest agent is not enabled.#033[00m
Nov 29 01:54:27 np0005539505 nova_compute[186958]: 2025-11-29 06:54:27.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:28 np0005539505 nova_compute[186958]: 2025-11-29 06:54:28.049 186962 DEBUG nova.privsep.utils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:54:28 np0005539505 nova_compute[186958]: 2025-11-29 06:54:28.050 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e.delta /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:29 np0005539505 nova_compute[186958]: 2025-11-29 06:54:29.068 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399254.066762, 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:54:29 np0005539505 nova_compute[186958]: 2025-11-29 06:54:29.069 186962 INFO nova.compute.manager [-] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:54:29 np0005539505 nova_compute[186958]: 2025-11-29 06:54:29.097 186962 DEBUG nova.compute.manager [None req-830a499e-a245-423d-87e5-d0f433fef532 - - - - - -] [instance: 7aa5ed66-1c17-4438-a36f-7b7f52ddb3e6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:54:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:29.322 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:29 np0005539505 nova_compute[186958]: 2025-11-29 06:54:29.352 186962 DEBUG oslo_concurrency.processutils [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e.delta /var/lib/nova/instances/snapshots/tmps8gl0jf7/851966f2908f4234b9937f163a65634e" returned: 0 in 1.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:29 np0005539505 nova_compute[186958]: 2025-11-29 06:54:29.354 186962 INFO nova.virt.libvirt.driver [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Snapshot extracted, beginning image upload#033[00m
Nov 29 01:54:30 np0005539505 nova_compute[186958]: 2025-11-29 06:54:30.531 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:31 np0005539505 nova_compute[186958]: 2025-11-29 06:54:31.668 186962 INFO nova.virt.libvirt.driver [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Snapshot image upload complete#033[00m
Nov 29 01:54:31 np0005539505 nova_compute[186958]: 2025-11-29 06:54:31.669 186962 INFO nova.compute.manager [None req-b3c0c2b1-0cc1-4563-a0d7-df5778762f1a 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Took 5.06 seconds to snapshot the instance on the hypervisor.#033[00m
Nov 29 01:54:32 np0005539505 podman[218655]: 2025-11-29 06:54:32.754694755 +0000 UTC m=+0.069554821 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:54:32 np0005539505 nova_compute[186958]: 2025-11-29 06:54:32.968 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:33 np0005539505 podman[218681]: 2025-11-29 06:54:33.785978814 +0000 UTC m=+0.109253959 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:54:35 np0005539505 nova_compute[186958]: 2025-11-29 06:54:35.535 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:36 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:36Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:8a:64 10.100.0.10
Nov 29 01:54:36 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:36Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:8a:64 10.100.0.10
Nov 29 01:54:37 np0005539505 podman[218718]: 2025-11-29 06:54:37.757342353 +0000 UTC m=+0.091180020 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 29 01:54:37 np0005539505 nova_compute[186958]: 2025-11-29 06:54:37.968 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:40 np0005539505 nova_compute[186958]: 2025-11-29 06:54:40.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:40 np0005539505 nova_compute[186958]: 2025-11-29 06:54:40.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:41 np0005539505 nova_compute[186958]: 2025-11-29 06:54:41.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:41 np0005539505 podman[218741]: 2025-11-29 06:54:41.733303061 +0000 UTC m=+0.058499800 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:54:43 np0005539505 nova_compute[186958]: 2025-11-29 06:54:43.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.416 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.417 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.418 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.418 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.498 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.585 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.587 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.680 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.844 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.845 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5572MB free_disk=73.23979949951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.846 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.846 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.950 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance ca75cc77-948e-4e28-a5db-b95961a337a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.951 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:54:44 np0005539505 nova_compute[186958]: 2025-11-29 06:54:44.952 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:54:45 np0005539505 nova_compute[186958]: 2025-11-29 06:54:45.042 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:45 np0005539505 nova_compute[186958]: 2025-11-29 06:54:45.074 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:45 np0005539505 nova_compute[186958]: 2025-11-29 06:54:45.104 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:54:45 np0005539505 nova_compute[186958]: 2025-11-29 06:54:45.105 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:45 np0005539505 nova_compute[186958]: 2025-11-29 06:54:45.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:46 np0005539505 nova_compute[186958]: 2025-11-29 06:54:46.101 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:46 np0005539505 nova_compute[186958]: 2025-11-29 06:54:46.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:46 np0005539505 nova_compute[186958]: 2025-11-29 06:54:46.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:54:48 np0005539505 nova_compute[186958]: 2025-11-29 06:54:48.009 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.082 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'name': 'tempest-ImagesTestJSON-server-2126280730', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'hostId': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.093 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.094 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a8d8385-e8e4-4e59-b712-15ccb2230c25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.083885', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b97ccfa-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': 'de35dd5388de990ab9009d93b56ab4e09f8ac4f040923fe48b91ed01998174dd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.083885', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b97dcea-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': 'a9dff33ebaff59947d412802b3170d9c39c72c4777d7ee0593c5e2c3f179d91d'}]}, 'timestamp': '2025-11-29 06:54:48.094817', '_unique_id': '20c47b9ac0c34bb293ed163c81ea0669'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.096 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.122 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.latency volume: 4554402030 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.123 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd46d1201-d6b1-4f1e-9b03-eb03f8e7b10e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4554402030, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.098129', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b9c3ace-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': '0b7beda3218674b7917a935179e25334d2113f34408d233cc5beec7517802760'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 
'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.098129', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b9c4a46-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': '7787b5a4e1a3923cb756f953cdb7a99c206dbcad940ee9218d146b258c78267e'}]}, 'timestamp': '2025-11-29 06:54:48.123882', '_unique_id': 'ce45354b080b4f849a0ea528bb646d47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.126 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.requests volume: 1117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.127 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3e2d53a-8034-46f5-900e-fb11ae62a857', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1117, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.126683', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b9cc91c-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'e5a6ec6d6d26b63e4e2a73a0fb00804f54512d523995e70c3c8d9120af81c7da'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 
'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.126683', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b9cd4ac-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'b1abb09b4b7dbcbba6321e38e3d8a4b90ab22249cb7b0efad7e7a75421a59342'}]}, 'timestamp': '2025-11-29 06:54:48.127385', '_unique_id': '0c9deaf28c324fa5ad4029c23b2b7523'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.129 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.bytes volume: 72880128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.129 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a0ffbe2-ce62-4a6a-a67c-cf53225cb0e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72880128, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.129395', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b9d3780-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': '774f6e41d0f4c4a565e1879d8fb280d59425bab33a3b23e6205f78605d3e858d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.129395', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b9d44e6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'e71ec35ef8cac37e7d15343ae64dba83204891b1dbf53ffae82656b2940a6d57'}]}, 'timestamp': '2025-11-29 06:54:48.130212', '_unique_id': '72e1bfc42c9d419eb5c958e5229b7bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.134 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ca75cc77-948e-4e28-a5db-b95961a337a5 / tapa38a0b58-03 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.134 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34b70298-d9a4-4c2d-abf0-eb1359ae1ac5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.131771', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4b9e0aca-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '019072b0ce1be2ca006c7aaff26e3f4389cd0d399e06ac21715ae3350634be44'}]}, 'timestamp': '2025-11-29 06:54:48.135488', '_unique_id': '49f4d9d39ae8412b893405287ba76f30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.152 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/memory.usage volume: 42.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06d1a430-8140-453e-b30c-ee7466b6ac5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.421875, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'timestamp': '2025-11-29T06:54:48.137729', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4ba0d3ae-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.793628732, 'message_signature': '8e3935e003c0ee0f4e5a4eb8d2d6e40ffaa4d1678158137be4efa17fa11eaf15'}]}, 'timestamp': '2025-11-29 06:54:48.153645', '_unique_id': 'c762edf89e8542319eefb0167915efba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.155 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.outgoing.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd548a264-4520-4c9d-8fb4-e1bd7d7d7d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.155650', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba131aa-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '4a2c3aa4c50fb1711eab5eb22b1304d6ee7bbb46409aa32a1dbec509deff7cee'}]}, 'timestamp': '2025-11-29 06:54:48.155999', '_unique_id': '4237af1b23214cff950c4c29209808d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.157 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '249bd648-34cf-42f7-9699-109b718966db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.157632', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba17eda-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '411f281406ac72d946b5c599b1dfc6ebf25c89afac84fde82b34c344105e3744'}]}, 'timestamp': '2025-11-29 06:54:48.157884', '_unique_id': '310c916a9fbe4dfd801cb9ab1bcb2d5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.159 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/cpu volume: 11470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04428de6-a5f2-4903-b749-4e8a4f808d36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11470000000, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'timestamp': '2025-11-29T06:54:48.159161', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4ba1bc88-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.793628732, 'message_signature': 'f06aa75c4c7a21c3cc4e6d69ce663d1631c6378df6d6d51c57181823245e0284'}]}, 'timestamp': '2025-11-29 06:54:48.159493', '_unique_id': 'e92213c3fa7a45a197cb350d2271c7e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.160 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ca96884-2d93-4ffd-b27b-e835b17031b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.160747', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba1f964-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': 'c6a02fc477b727c56566175c4db0b613853d693c16f947a65fa3090e3a4ec313'}]}, 'timestamp': '2025-11-29 06:54:48.161019', '_unique_id': '0f7768cf90184e56ad92145daaf8124f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.162 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '474e2214-9a4f-42a3-b1f1-d55865abb0e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.162489', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba23d7a-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': 'bfadf14fcfaada6b5d9dd373c1257a397943cf087df79a148716a3c87f08476f'}]}, 'timestamp': '2025-11-29 06:54:48.162814', '_unique_id': '505d3555f6e048ebaea1aa2f5016d86d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.164 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.164 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>]
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.164 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2d721cd-68dc-4399-8d2c-01bb828442ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.164793', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba2969e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': '198fd588f76a0f66884537fedab3034bb8e7023242f40a71200abff7adfa779c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 
'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.164793', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba29f18-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': 'e5ee7de11b6c2a997cca8a0dc783fb54bacd9c1dedc317e461c221e14405f890'}]}, 'timestamp': '2025-11-29 06:54:48.165266', '_unique_id': 'cb02eccce58a4b689331c238c1542a88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.166 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.166 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>]
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.166 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.outgoing.bytes volume: 1550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c9b333a-00ad-42aa-bc9c-4747ab54fe10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1550, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.166968', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba2ec48-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': 'bd5ea029e8715720a8e8a65ac4b0f39c1a1ae79a97cbe7bb83fb06281017091f'}]}, 'timestamp': '2025-11-29 06:54:48.167297', '_unique_id': '93a5d7a27c294788b33e3c2706661854'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.168 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.bytes volume: 31181312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.168 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d979c03-bd97-47c5-ac6c-22de866cef2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31181312, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.168602', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba32a96-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': '7c6eab154831c926b54de2d9000542b30a46289f638f00de029478e1316be8b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 
'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.168602', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba3328e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'aa05042c34a26840d3ceaf00fb6f28af64559948dca84cf6dfc287e22cb08311'}]}, 'timestamp': '2025-11-29 06:54:48.169025', '_unique_id': 'f930363703824693b0662e03d6d51694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.170 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.requests volume: 274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.170 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e15981ee-e53b-43c4-93fa-bc74d4d8847f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 274, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.170487', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba375e6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'dd16286513750c5f09da00e58eb24e8d70f7ec7531b6d44abe58775b884388d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 
'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.170487', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba380a4-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'e796a3a8e0ef4f706b0ad693386cd869a3cf0b2aecff49839917783f9dd5115c'}]}, 'timestamp': '2025-11-29 06:54:48.171019', '_unique_id': '819c56404e384fd28c2f7c2dde7409df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.172 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49a71ea3-0f12-4a06-a4b0-ce15dc48891e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.172275', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba3ba88-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '31c86e9b2add64660159f3f97c23eff8c707e170bfc75f965f7ac2e909c4adb3'}]}, 'timestamp': '2025-11-29 06:54:48.172514', '_unique_id': '5f150dd7b021419ea1d1d40442ebcbb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.173 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.latency volume: 228020365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.174 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.read.latency volume: 24678415 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d506a3a-dcbd-49e1-b0d4-4ab0abfca6ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 228020365, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.173933', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba3fdfe-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'bde92e435396b5a305aa8926ae8bf6efccc1fd7a530cad432895e88ee36d36dd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24678415, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.173933', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba4092a-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.739037653, 'message_signature': 'b1f009f6c708ce2cf1c46a4ea6dbaac35483c5e41c9e44817e048ea9a1e8df3e'}]}, 'timestamp': '2025-11-29 06:54:48.174548', '_unique_id': '4c15d358474b4ccb8232da81f6625b6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.176 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.176 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>]
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.176 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f0856a7-ed06-43bd-bcf4-d778d5c5277d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.176365', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba45a38-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '2b85fbd8712e9acbca9eed87aa3bb94cd2f9dfc1b98c6f7eae490fce28319515'}]}, 'timestamp': '2025-11-29 06:54:48.176605', '_unique_id': '012653d798124a73b502c1720c39dd8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.177 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af975947-fdf5-4bea-8101-b2919c2a9dbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.177719', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba48f58-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '4d6b8ed8553b279d884325850e9132998b8fa92973ac24d316beb07263034692'}]}, 'timestamp': '2025-11-29 06:54:48.177965', '_unique_id': '63193262a32a4a0e9cfa82160ef1772b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.178 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.179 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.179 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ImagesTestJSON-server-2126280730>]
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.179 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b93d3e0-4017-4b42-ae78-67b3adf03f3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'instance-0000001f-ca75cc77-948e-4e28-a5db-b95961a337a5-tapa38a0b58-03', 'timestamp': '2025-11-29T06:54:48.179407', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'tapa38a0b58-03', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d3:8a:64', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa38a0b58-03'}, 'message_id': '4ba4d0c6-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.77264626, 'message_signature': '85cd3169ee251fb449478d1ef38b67a18963b54ef61dee27b523e27298447a7a'}]}, 'timestamp': '2025-11-29 06:54:48.179647', '_unique_id': 'bd1fede34cd6490fabdae0a77d91c07d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.181 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.181 12 DEBUG ceilometer.compute.pollsters [-] ca75cc77-948e-4e28-a5db-b95961a337a5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66b686d3-5e70-406f-a6f0-3e14bd062d0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5-vda', 'timestamp': '2025-11-29T06:54:48.181079', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4ba5146e-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': '7cdc7abd6efd56b59844c30d927743a2a53cd65257a796cbca009b563424d2c8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '315be492c2ce4b9f8af2898e6794a256', 'user_name': None, 'project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'project_name': None, 'resource_id': 
'ca75cc77-948e-4e28-a5db-b95961a337a5-sda', 'timestamp': '2025-11-29T06:54:48.181079', 'resource_metadata': {'display_name': 'tempest-ImagesTestJSON-server-2126280730', 'name': 'instance-0000001f', 'instance_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'instance_type': 'm1.nano', 'host': '9e8a020ddc982bb9b39c2f06d1e22a36257b29987c9450e78e8c6919', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4ba51f7c-ccf0-11f0-8954-fa163e5a5606', 'monotonic_time': 4735.72472273, 'message_signature': 'a3a1f726e3ce2e4568a3f348e8e2e4e4249849bfa6939a9437471f4f1e2b191e'}]}, 'timestamp': '2025-11-29 06:54:48.181677', '_unique_id': 'bfe2822d358a44aa961cec306be04482'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 01:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:54:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 01:54:48 np0005539505 nova_compute[186958]: 2025-11-29 06:54:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:48 np0005539505 nova_compute[186958]: 2025-11-29 06:54:48.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:54:48 np0005539505 nova_compute[186958]: 2025-11-29 06:54:48.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:54:49 np0005539505 nova_compute[186958]: 2025-11-29 06:54:49.132 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:54:49 np0005539505 nova_compute[186958]: 2025-11-29 06:54:49.132 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:54:49 np0005539505 nova_compute[186958]: 2025-11-29 06:54:49.133 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:54:49 np0005539505 nova_compute[186958]: 2025-11-29 06:54:49.133 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ca75cc77-948e-4e28-a5db-b95961a337a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:50 np0005539505 nova_compute[186958]: 2025-11-29 06:54:50.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:52 np0005539505 podman[218768]: 2025-11-29 06:54:52.780268713 +0000 UTC m=+0.093113905 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:54:52 np0005539505 podman[218767]: 2025-11-29 06:54:52.791669744 +0000 UTC m=+0.116105173 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, config_id=edpm, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.060 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.214 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.215 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.215 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.215 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.215 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.230 186962 INFO nova.compute.manager [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Terminating instance#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.242 186962 DEBUG nova.compute.manager [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:54:53 np0005539505 kernel: tapa38a0b58-03 (unregistering): left promiscuous mode
Nov 29 01:54:53 np0005539505 NetworkManager[55134]: <info>  [1764399293.2754] device (tapa38a0b58-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:54:53 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:53Z|00136|binding|INFO|Releasing lport a38a0b58-03dd-4e7a-b14d-caa4b2952069 from this chassis (sb_readonly=0)
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:53Z|00137|binding|INFO|Setting lport a38a0b58-03dd-4e7a-b14d-caa4b2952069 down in Southbound
Nov 29 01:54:53 np0005539505 ovn_controller[95143]: 2025-11-29T06:54:53Z|00138|binding|INFO|Removing iface tapa38a0b58-03 ovn-installed in OVS
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.285 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.307 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:53.307 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:8a:64 10.100.0.10'], port_security=['fa:16:3e:d3:8a:64 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'ca75cc77-948e-4e28-a5db-b95961a337a5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '78f8ba841bbe4fdcb9d9e2237d97bf73', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ca8fef31-1a4b-4249-948f-73ea087430b2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8122595a-c31d-4e3d-a668-dbae500c1d72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a38a0b58-03dd-4e7a-b14d-caa4b2952069) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:54:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:53.310 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a38a0b58-03dd-4e7a-b14d-caa4b2952069 in datapath 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba unbound from our chassis#033[00m
Nov 29 01:54:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:53.315 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:54:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:53.317 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83ec5f35-e96f-4e17-b2f0-65b223715249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:53.318 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba namespace which is not needed anymore#033[00m
Nov 29 01:54:53 np0005539505 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Nov 29 01:54:53 np0005539505 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Consumed 13.890s CPU time.
Nov 29 01:54:53 np0005539505 systemd-machined[153285]: Machine qemu-15-instance-0000001f terminated.
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.497 186962 INFO nova.virt.libvirt.driver [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Instance destroyed successfully.#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.499 186962 DEBUG nova.objects.instance [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lazy-loading 'resources' on Instance uuid ca75cc77-948e-4e28-a5db-b95961a337a5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.520 186962 DEBUG nova.virt.libvirt.vif [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:54:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2126280730',display_name='tempest-ImagesTestJSON-server-2126280730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-2126280730',id=31,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:54:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='78f8ba841bbe4fdcb9d9e2237d97bf73',ramdisk_id='',reservation_id='r-a3qp5rci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1674785298',owner_user_name='tempest-ImagesTestJSON-1674785298-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:54:31Z,user_data=None,user_id='315be492c2ce4b9f8af2898e6794a256',uuid=ca75cc77-948e-4e28-a5db-b95961a337a5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.521 186962 DEBUG nova.network.os_vif_util [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converting VIF {"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.522 186962 DEBUG nova.network.os_vif_util [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.522 186962 DEBUG os_vif [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.525 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.526 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa38a0b58-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.538 186962 INFO os_vif [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:8a:64,bridge_name='br-int',has_traffic_filtering=True,id=a38a0b58-03dd-4e7a-b14d-caa4b2952069,network=Network(17ec2ca4-3fa9-41aa-80ef-35bf92d404ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38a0b58-03')#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.539 186962 INFO nova.virt.libvirt.driver [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Deleting instance files /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5_del#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.539 186962 INFO nova.virt.libvirt.driver [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Deletion of /var/lib/nova/instances/ca75cc77-948e-4e28-a5db-b95961a337a5_del complete#033[00m
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [NOTICE]   (218602) : haproxy version is 2.8.14-c23fe91
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [NOTICE]   (218602) : path to executable is /usr/sbin/haproxy
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [WARNING]  (218602) : Exiting Master process...
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [WARNING]  (218602) : Exiting Master process...
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [ALERT]    (218602) : Current worker (218604) exited with code 143 (Terminated)
Nov 29 01:54:53 np0005539505 neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba[218598]: [WARNING]  (218602) : All workers exited. Exiting... (0)
Nov 29 01:54:53 np0005539505 systemd[1]: libpod-01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217.scope: Deactivated successfully.
Nov 29 01:54:53 np0005539505 podman[218834]: 2025-11-29 06:54:53.596091672 +0000 UTC m=+0.188177734 container died 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.723 186962 INFO nova.compute.manager [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.724 186962 DEBUG oslo.service.loopingcall [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.724 186962 DEBUG nova.compute.manager [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:54:53 np0005539505 nova_compute[186958]: 2025-11-29 06:54:53.724 186962 DEBUG nova.network.neutron [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:54:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217-userdata-shm.mount: Deactivated successfully.
Nov 29 01:54:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay-b27623b3cfb92dec4ad04af794a6f893bf67614e0082a681acf042b36739d5f9-merged.mount: Deactivated successfully.
Nov 29 01:54:53 np0005539505 podman[218834]: 2025-11-29 06:54:53.878329625 +0000 UTC m=+0.470415667 container cleanup 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 01:54:53 np0005539505 systemd[1]: libpod-conmon-01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217.scope: Deactivated successfully.
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.256 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updating instance_info_cache with network_info: [{"id": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "address": "fa:16:3e:d3:8a:64", "network": {"id": "17ec2ca4-3fa9-41aa-80ef-35bf92d404ba", "bridge": "br-int", "label": "tempest-ImagesTestJSON-31490053-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "78f8ba841bbe4fdcb9d9e2237d97bf73", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38a0b58-03", "ovs_interfaceid": "a38a0b58-03dd-4e7a-b14d-caa4b2952069", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.312 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-ca75cc77-948e-4e28-a5db-b95961a337a5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.313 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.328 186962 DEBUG nova.compute.manager [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-unplugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.328 186962 DEBUG oslo_concurrency.lockutils [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.328 186962 DEBUG oslo_concurrency.lockutils [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.329 186962 DEBUG oslo_concurrency.lockutils [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.329 186962 DEBUG nova.compute.manager [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] No waiting events found dispatching network-vif-unplugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.329 186962 DEBUG nova.compute.manager [req-454f6164-c27a-4546-90d0-e4f299958eda req-606b3413-2328-4c15-a2e5-66dc96d87d88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-unplugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:54:55 np0005539505 podman[218883]: 2025-11-29 06:54:55.57448026 +0000 UTC m=+1.657970380 container remove 01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.584 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef0580a-4203-4cc0-aa9b-74385471a5dc]: (4, ('Sat Nov 29 06:54:53 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217)\n01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217\nSat Nov 29 06:54:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba (01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217)\n01485a00813595d50b8858fa356043cc4302071600e868050af5140562b8f217\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.587 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b59f9d8f-9fc7-4a37-98a9-294f298ebc32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.590 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17ec2ca4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.593 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:55 np0005539505 kernel: tap17ec2ca4-30: left promiscuous mode
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.595 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.599 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[67bbca89-bf9e-49b5-b6b5-bba62b4e0130]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 nova_compute[186958]: 2025-11-29 06:54:55.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.625 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[025f1ef3-8bb3-438f-ae37-c4c3d39fbe6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.628 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[92bce1d1-b135-4f5c-8f5d-1b746cb79b9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.661 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[68599fe6-cec7-4af7-9855-47b781f5657b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 470958, 'reachable_time': 26034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218899, 'error': None, 'target': 'ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:55 np0005539505 systemd[1]: run-netns-ovnmeta\x2d17ec2ca4\x2d3fa9\x2d41aa\x2d80ef\x2d35bf92d404ba.mount: Deactivated successfully.
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.668 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-17ec2ca4-3fa9-41aa-80ef-35bf92d404ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:54:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:54:55.668 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[6e76e7c1-2925-4df9-b764-8cf652419e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.433 186962 DEBUG nova.compute.manager [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.434 186962 DEBUG oslo_concurrency.lockutils [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.435 186962 DEBUG oslo_concurrency.lockutils [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.435 186962 DEBUG oslo_concurrency.lockutils [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.435 186962 DEBUG nova.compute.manager [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] No waiting events found dispatching network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.436 186962 WARNING nova.compute.manager [req-8dee0495-57d8-4b49-b871-849a7dab83e1 req-a9310567-5197-4255-829e-f0e34e051a5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received unexpected event network-vif-plugged-a38a0b58-03dd-4e7a-b14d-caa4b2952069 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.575 186962 DEBUG nova.network.neutron [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.621 186962 INFO nova.compute.manager [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Took 3.90 seconds to deallocate network for instance.#033[00m
Nov 29 01:54:57 np0005539505 podman[218900]: 2025-11-29 06:54:57.762596129 +0000 UTC m=+0.073272906 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.770 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.770 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.916 186962 DEBUG nova.compute.provider_tree [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.934 186962 DEBUG nova.scheduler.client.report [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:54:57 np0005539505 nova_compute[186958]: 2025-11-29 06:54:57.983 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:58 np0005539505 nova_compute[186958]: 2025-11-29 06:54:58.033 186962 INFO nova.scheduler.client.report [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Deleted allocations for instance ca75cc77-948e-4e28-a5db-b95961a337a5#033[00m
Nov 29 01:54:58 np0005539505 nova_compute[186958]: 2025-11-29 06:54:58.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:58 np0005539505 nova_compute[186958]: 2025-11-29 06:54:58.143 186962 DEBUG oslo_concurrency.lockutils [None req-7ded8313-e99a-4d7c-b953-8f0996a3ced5 315be492c2ce4b9f8af2898e6794a256 78f8ba841bbe4fdcb9d9e2237d97bf73 - - default default] Lock "ca75cc77-948e-4e28-a5db-b95961a337a5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:54:58 np0005539505 nova_compute[186958]: 2025-11-29 06:54:58.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:54:59 np0005539505 nova_compute[186958]: 2025-11-29 06:54:59.787 186962 DEBUG nova.compute.manager [req-0afc7adc-314f-41aa-a575-3008ab51ad42 req-8ea4cc9c-a8d8-4eb6-87f0-7fddceac5e90 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Received event network-vif-deleted-a38a0b58-03dd-4e7a-b14d-caa4b2952069 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:55:03 np0005539505 nova_compute[186958]: 2025-11-29 06:55:03.112 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:03 np0005539505 nova_compute[186958]: 2025-11-29 06:55:03.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:03 np0005539505 podman[218921]: 2025-11-29 06:55:03.770307799 +0000 UTC m=+0.089245795 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:55:04 np0005539505 podman[218947]: 2025-11-29 06:55:04.818864137 +0000 UTC m=+0.144286667 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:55:08 np0005539505 nova_compute[186958]: 2025-11-29 06:55:08.114 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:08 np0005539505 nova_compute[186958]: 2025-11-29 06:55:08.495 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399293.4942813, ca75cc77-948e-4e28-a5db-b95961a337a5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:08 np0005539505 nova_compute[186958]: 2025-11-29 06:55:08.496 186962 INFO nova.compute.manager [-] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:55:08 np0005539505 nova_compute[186958]: 2025-11-29 06:55:08.520 186962 DEBUG nova.compute.manager [None req-f3ad6421-247b-4e67-bf5d-9736627f1e98 - - - - - -] [instance: ca75cc77-948e-4e28-a5db-b95961a337a5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:08 np0005539505 nova_compute[186958]: 2025-11-29 06:55:08.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:08 np0005539505 podman[218973]: 2025-11-29 06:55:08.752270696 +0000 UTC m=+0.074410398 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 01:55:11 np0005539505 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 01:55:12 np0005539505 podman[218994]: 2025-11-29 06:55:12.754410733 +0000 UTC m=+0.081943140 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 01:55:13 np0005539505 nova_compute[186958]: 2025-11-29 06:55:13.171 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:13 np0005539505 nova_compute[186958]: 2025-11-29 06:55:13.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:18 np0005539505 nova_compute[186958]: 2025-11-29 06:55:18.174 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:18 np0005539505 nova_compute[186958]: 2025-11-29 06:55:18.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:23 np0005539505 nova_compute[186958]: 2025-11-29 06:55:23.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:23 np0005539505 nova_compute[186958]: 2025-11-29 06:55:23.611 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:23 np0005539505 podman[219015]: 2025-11-29 06:55:23.747640182 +0000 UTC m=+0.083768533 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 01:55:23 np0005539505 podman[219016]: 2025-11-29 06:55:23.75242003 +0000 UTC m=+0.074719942 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 01:55:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:26.931 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:26.932 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:26.932 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:28 np0005539505 nova_compute[186958]: 2025-11-29 06:55:28.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:28 np0005539505 podman[219057]: 2025-11-29 06:55:28.292303593 +0000 UTC m=+0.057383110 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:55:28 np0005539505 nova_compute[186958]: 2025-11-29 06:55:28.613 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:29 np0005539505 nova_compute[186958]: 2025-11-29 06:55:29.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:33 np0005539505 nova_compute[186958]: 2025-11-29 06:55:33.211 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:33.331 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:55:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:33.332 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:55:33 np0005539505 nova_compute[186958]: 2025-11-29 06:55:33.332 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:33 np0005539505 nova_compute[186958]: 2025-11-29 06:55:33.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:34 np0005539505 podman[219076]: 2025-11-29 06:55:34.749420939 +0000 UTC m=+0.085787781 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.335 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "a7c7d375-ef91-4869-987b-662d0c1de55c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.335 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.374 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.514 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.514 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.523 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.524 186962 INFO nova.compute.claims [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.635 186962 DEBUG nova.compute.provider_tree [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.667 186962 DEBUG nova.scheduler.client.report [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.693 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.694 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.757 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.758 186962 DEBUG nova.network.neutron [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:55:35 np0005539505 podman[219103]: 2025-11-29 06:55:35.765455456 +0000 UTC m=+0.097042317 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.778 186962 INFO nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.794 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.949 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.950 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.951 186962 INFO nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Creating image(s)#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.951 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.951 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.952 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:35 np0005539505 nova_compute[186958]: 2025-11-29 06:55:35.966 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.022 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.023 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.023 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.034 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.096 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.098 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.179 186962 DEBUG nova.network.neutron [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.179 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.400 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk 1073741824" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.401 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.378s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.402 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.481 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.483 186962 DEBUG nova.virt.disk.api [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Checking if we can resize image /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.484 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.574 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.576 186962 DEBUG nova.virt.disk.api [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Cannot resize image /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.577 186962 DEBUG nova.objects.instance [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.599 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.600 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Ensure instance console log exists: /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.601 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.602 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.602 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.605 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.613 186962 WARNING nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.621 186962 DEBUG nova.virt.libvirt.host [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.622 186962 DEBUG nova.virt.libvirt.host [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.626 186962 DEBUG nova.virt.libvirt.host [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.627 186962 DEBUG nova.virt.libvirt.host [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.631 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.632 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.633 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.634 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.634 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.635 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.635 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.636 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.637 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.637 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.638 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.638 186962 DEBUG nova.virt.hardware [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.646 186962 DEBUG nova.objects.instance [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.662 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <uuid>a7c7d375-ef91-4869-987b-662d0c1de55c</uuid>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <name>instance-00000023</name>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:name>tempest-MigrationsAdminTest-server-989129995</nova:name>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:55:36</nova:creationTime>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:        <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="serial">a7c7d375-ef91-4869-987b-662d0c1de55c</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="uuid">a7c7d375-ef91-4869-987b-662d0c1de55c</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/console.log" append="off"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:55:36 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:55:36 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:55:36 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:55:36 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.913 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.913 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:55:36 np0005539505 nova_compute[186958]: 2025-11-29 06:55:36.914 186962 INFO nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Using config drive#033[00m
Nov 29 01:55:37 np0005539505 nova_compute[186958]: 2025-11-29 06:55:37.205 186962 INFO nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Creating config drive at /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config#033[00m
Nov 29 01:55:37 np0005539505 nova_compute[186958]: 2025-11-29 06:55:37.212 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_5owvcn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:37 np0005539505 nova_compute[186958]: 2025-11-29 06:55:37.357 186962 DEBUG oslo_concurrency.processutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_5owvcn" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:37 np0005539505 systemd-machined[153285]: New machine qemu-16-instance-00000023.
Nov 29 01:55:37 np0005539505 systemd[1]: Started Virtual Machine qemu-16-instance-00000023.
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.093 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399338.0928457, a7c7d375-ef91-4869-987b-662d0c1de55c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.094 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.096 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.097 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.100 186962 INFO nova.virt.libvirt.driver [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance spawned successfully.#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.101 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.112 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.117 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.122 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.122 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.123 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.123 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.123 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.124 186962 DEBUG nova.virt.libvirt.driver [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.154 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.154 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399338.0941725, a7c7d375-ef91-4869-987b-662d0c1de55c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.154 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Started (Lifecycle Event)#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.213 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.231 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.235 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.265 186962 INFO nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Took 2.32 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.265 186962 DEBUG nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.271 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.363 186962 INFO nova.compute.manager [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Took 2.89 seconds to build instance.#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.393 186962 DEBUG oslo_concurrency.lockutils [None req-5f4c25ac-2470-4e0e-bd4b-9a29ef5e3f1e 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:38 np0005539505 nova_compute[186958]: 2025-11-29 06:55:38.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:55:39.334 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:55:39 np0005539505 podman[219173]: 2025-11-29 06:55:39.733684551 +0000 UTC m=+0.062994303 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 01:55:41 np0005539505 nova_compute[186958]: 2025-11-29 06:55:41.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:42 np0005539505 nova_compute[186958]: 2025-11-29 06:55:42.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:42 np0005539505 nova_compute[186958]: 2025-11-29 06:55:42.400 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.279 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.479 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.480 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.481 186962 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:43 np0005539505 podman[219194]: 2025-11-29 06:55:43.752466807 +0000 UTC m=+0.077625766 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:55:43 np0005539505 nova_compute[186958]: 2025-11-29 06:55:43.799 186962 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.228 186962 DEBUG nova.network.neutron [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.247 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.412 186962 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.413 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Creating file /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/c98b8348ed6442d1b4026206fe2fe0b5.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.413 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/c98b8348ed6442d1b4026206fe2fe0b5.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.862 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/c98b8348ed6442d1b4026206fe2fe0b5.tmp" returned: 1 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.864 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/c98b8348ed6442d1b4026206fe2fe0b5.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.864 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Creating directory /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 01:55:44 np0005539505 nova_compute[186958]: 2025-11-29 06:55:44.865 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.069 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c" returned: 0 in 0.205s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.075 186962 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.410 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.411 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.411 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.412 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.484 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.546 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.547 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.610 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.773 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.775 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5619MB free_disk=73.26776504516602GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.775 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:45 np0005539505 nova_compute[186958]: 2025-11-29 06:55:45.776 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.273 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating resource usage from migration 6f1aff30-664d-4ffd-b820-c05fab910a30#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.345 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Migration 6f1aff30-664d-4ffd-b820-c05fab910a30 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.346 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.346 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.438 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.468 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:55:46 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.519 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:55:46 np0005539505 nova_compute[186958]: 2025-11-29 06:55:46.520 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.744s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:55:48 np0005539505 nova_compute[186958]: 2025-11-29 06:55:48.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:48 np0005539505 nova_compute[186958]: 2025-11-29 06:55:48.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:49 np0005539505 nova_compute[186958]: 2025-11-29 06:55:49.520 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:49 np0005539505 nova_compute[186958]: 2025-11-29 06:55:49.522 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:49 np0005539505 nova_compute[186958]: 2025-11-29 06:55:49.522 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.400 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.401 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:55:50 np0005539505 nova_compute[186958]: 2025-11-29 06:55:50.402 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:55:51 np0005539505 nova_compute[186958]: 2025-11-29 06:55:51.456 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:55:52 np0005539505 nova_compute[186958]: 2025-11-29 06:55:52.284 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:55:52 np0005539505 nova_compute[186958]: 2025-11-29 06:55:52.302 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:55:52 np0005539505 nova_compute[186958]: 2025-11-29 06:55:52.303 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:55:52 np0005539505 nova_compute[186958]: 2025-11-29 06:55:52.304 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:53 np0005539505 nova_compute[186958]: 2025-11-29 06:55:53.284 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:53 np0005539505 nova_compute[186958]: 2025-11-29 06:55:53.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:54 np0005539505 podman[219243]: 2025-11-29 06:55:54.780411559 +0000 UTC m=+0.097646464 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 01:55:54 np0005539505 podman[219242]: 2025-11-29 06:55:54.796774053 +0000 UTC m=+0.113852693 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 01:55:55 np0005539505 nova_compute[186958]: 2025-11-29 06:55:55.132 186962 DEBUG nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:55:55 np0005539505 nova_compute[186958]: 2025-11-29 06:55:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:55:57 np0005539505 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000023.scope: Deactivated successfully.
Nov 29 01:55:57 np0005539505 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000023.scope: Consumed 12.757s CPU time.
Nov 29 01:55:57 np0005539505 systemd-machined[153285]: Machine qemu-16-instance-00000023 terminated.
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.152 186962 INFO nova.virt.libvirt.driver [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.160 186962 INFO nova.virt.libvirt.driver [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance destroyed successfully.#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.165 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.264 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.266 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.288 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.324 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.326 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk to 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.327 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:58 np0005539505 nova_compute[186958]: 2025-11-29 06:55:58.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:55:58 np0005539505 podman[219304]: 2025-11-29 06:55:58.783246574 +0000 UTC m=+0.107384266 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.199 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk" returned: 0 in 0.872s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.200 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.201 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.config 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.440 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -C -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.config 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.config" returned: 0 in 0.240s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.442 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Copying file /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.442 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.info 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.706 186962 DEBUG oslo_concurrency.processutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] CMD "scp -C -r /var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c_resize/disk.info 192.168.122.101:/var/lib/nova/instances/a7c7d375-ef91-4869-987b-662d0c1de55c/disk.info" returned: 0 in 0.263s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.834 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.834 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:55:59 np0005539505 nova_compute[186958]: 2025-11-29 06:55:59.835 186962 DEBUG oslo_concurrency.lockutils [None req-385ec15d-d89f-49ad-9083-0c57d9ac5918 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:03 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:03Z|00139|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 01:56:03 np0005539505 nova_compute[186958]: 2025-11-29 06:56:03.290 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:03 np0005539505 nova_compute[186958]: 2025-11-29 06:56:03.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.124 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "a7c7d375-ef91-4869-987b-662d0c1de55c" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.125 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.126 186962 DEBUG nova.compute.manager [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Going to confirm migration 8 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.164 186962 DEBUG nova.objects.instance [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'info_cache' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.695 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.695 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.696 186962 DEBUG nova.network.neutron [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:56:04 np0005539505 nova_compute[186958]: 2025-11-29 06:56:04.897 186962 DEBUG nova.network.neutron [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.467 186962 DEBUG nova.network.neutron [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.560 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-a7c7d375-ef91-4869-987b-662d0c1de55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.561 186962 DEBUG nova.objects.instance [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid a7c7d375-ef91-4869-987b-662d0c1de55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.611 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.612 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:05 np0005539505 podman[219328]: 2025-11-29 06:56:05.72456256 +0000 UTC m=+0.049170643 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.917 186962 DEBUG nova.compute.provider_tree [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:05 np0005539505 nova_compute[186958]: 2025-11-29 06:56:05.974 186962 DEBUG nova.scheduler.client.report [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:06 np0005539505 nova_compute[186958]: 2025-11-29 06:56:06.042 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:06 np0005539505 nova_compute[186958]: 2025-11-29 06:56:06.224 186962 INFO nova.scheduler.client.report [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Deleted allocation for migration 6f1aff30-664d-4ffd-b820-c05fab910a30#033[00m
Nov 29 01:56:06 np0005539505 nova_compute[186958]: 2025-11-29 06:56:06.308 186962 DEBUG oslo_concurrency.lockutils [None req-4d01ccdc-f888-4759-aab8-a87848293c06 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "a7c7d375-ef91-4869-987b-662d0c1de55c" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:06 np0005539505 podman[219351]: 2025-11-29 06:56:06.798264723 +0000 UTC m=+0.125089618 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 01:56:08 np0005539505 nova_compute[186958]: 2025-11-29 06:56:08.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:08 np0005539505 nova_compute[186958]: 2025-11-29 06:56:08.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:10 np0005539505 podman[219377]: 2025-11-29 06:56:10.776654781 +0000 UTC m=+0.099096837 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:56:12 np0005539505 nova_compute[186958]: 2025-11-29 06:56:12.611 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399357.6096232, a7c7d375-ef91-4869-987b-662d0c1de55c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:12 np0005539505 nova_compute[186958]: 2025-11-29 06:56:12.612 186962 INFO nova.compute.manager [-] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:56:12 np0005539505 nova_compute[186958]: 2025-11-29 06:56:12.638 186962 DEBUG nova.compute.manager [None req-8f313ec2-107e-467a-afe6-cbacc4627650 - - - - - -] [instance: a7c7d375-ef91-4869-987b-662d0c1de55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:13 np0005539505 nova_compute[186958]: 2025-11-29 06:56:13.294 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:13 np0005539505 nova_compute[186958]: 2025-11-29 06:56:13.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:14 np0005539505 podman[219397]: 2025-11-29 06:56:14.75718967 +0000 UTC m=+0.084618037 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:56:18 np0005539505 nova_compute[186958]: 2025-11-29 06:56:18.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:18 np0005539505 nova_compute[186958]: 2025-11-29 06:56:18.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:23 np0005539505 nova_compute[186958]: 2025-11-29 06:56:23.298 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:23 np0005539505 nova_compute[186958]: 2025-11-29 06:56:23.646 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:25 np0005539505 podman[219418]: 2025-11-29 06:56:25.728168237 +0000 UTC m=+0.054277140 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:56:25 np0005539505 podman[219417]: 2025-11-29 06:56:25.757068792 +0000 UTC m=+0.085573635 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 29 01:56:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:26.933 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:26.936 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:26.936 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:28 np0005539505 nova_compute[186958]: 2025-11-29 06:56:28.300 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:28 np0005539505 nova_compute[186958]: 2025-11-29 06:56:28.649 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:29 np0005539505 podman[219461]: 2025-11-29 06:56:29.71295014 +0000 UTC m=+0.050940364 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:56:33 np0005539505 nova_compute[186958]: 2025-11-29 06:56:33.303 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:33 np0005539505 nova_compute[186958]: 2025-11-29 06:56:33.652 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:36 np0005539505 podman[219480]: 2025-11-29 06:56:36.758327854 +0000 UTC m=+0.082851116 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:56:37 np0005539505 podman[219504]: 2025-11-29 06:56:37.817564391 +0000 UTC m=+0.142809120 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:56:38 np0005539505 nova_compute[186958]: 2025-11-29 06:56:38.341 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:38 np0005539505 nova_compute[186958]: 2025-11-29 06:56:38.654 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:41 np0005539505 nova_compute[186958]: 2025-11-29 06:56:41.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:41 np0005539505 podman[219533]: 2025-11-29 06:56:41.764713376 +0000 UTC m=+0.097901362 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm)
Nov 29 01:56:42 np0005539505 nova_compute[186958]: 2025-11-29 06:56:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.343 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.399 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.400 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.424 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.537 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.538 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.546 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.546 186962 INFO nova.compute.claims [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.656 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.679 186962 DEBUG nova.compute.provider_tree [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.696 186962 DEBUG nova.scheduler.client.report [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.724 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.725 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.797 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.798 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.820 186962 INFO nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.846 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.979 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.981 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.982 186962 INFO nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Creating image(s)#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.984 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.984 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:43 np0005539505 nova_compute[186958]: 2025-11-29 06:56:43.986 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.014 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.087 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.090 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.091 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.114 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.190 186962 DEBUG nova.policy [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e492001fe194231b630bba63bb7b39b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.206 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.207 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.991 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk 1073741824" returned: 0 in 0.784s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.992 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:44 np0005539505 nova_compute[186958]: 2025-11-29 06:56:44.993 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.071 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Successfully created port: 2641b5e5-04d4-4190-9adf-40f11057bd93 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.085 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.086 186962 DEBUG nova.virt.disk.api [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Checking if we can resize image /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.087 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.156 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.157 186962 DEBUG nova.virt.disk.api [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Cannot resize image /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.158 186962 DEBUG nova.objects.instance [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lazy-loading 'migration_context' on Instance uuid 373bf1d0-aa63-4995-87c5-d6a01e995a40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.186 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.186 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Ensure instance console log exists: /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.186 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.187 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.187 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.389 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:45 np0005539505 podman[219568]: 2025-11-29 06:56:45.749615852 +0000 UTC m=+0.080229711 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:56:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:45.780 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:56:45 np0005539505 nova_compute[186958]: 2025-11-29 06:56:45.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:45.782 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.142 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Successfully updated port: 2641b5e5-04d4-4190-9adf-40f11057bd93 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.174 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.175 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquired lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.176 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.391 186962 DEBUG nova.compute.manager [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-changed-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.392 186962 DEBUG nova.compute.manager [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Refreshing instance network info cache due to event network-changed-2641b5e5-04d4-4190-9adf-40f11057bd93. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.392 186962 DEBUG oslo_concurrency.lockutils [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:56:46 np0005539505 nova_compute[186958]: 2025-11-29 06:56:46.725 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.390 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.390 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.418 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.418 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.419 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.419 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.430 186962 DEBUG nova.network.neutron [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.466 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Releasing lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.467 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Instance network_info: |[{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.468 186962 DEBUG oslo_concurrency.lockutils [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.468 186962 DEBUG nova.network.neutron [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Refreshing network info cache for port 2641b5e5-04d4-4190-9adf-40f11057bd93 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.472 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Start _get_guest_xml network_info=[{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.478 186962 WARNING nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.485 186962 DEBUG nova.virt.libvirt.host [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.487 186962 DEBUG nova.virt.libvirt.host [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.498 186962 DEBUG nova.virt.libvirt.host [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.500 186962 DEBUG nova.virt.libvirt.host [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.501 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.502 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.502 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.502 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.503 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.503 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.503 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.504 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.504 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.504 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.505 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.505 186962 DEBUG nova.virt.hardware [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.510 186962 DEBUG nova.virt.libvirt.vif [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:43Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.511 186962 DEBUG nova.network.os_vif_util [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.512 186962 DEBUG nova.network.os_vif_util [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.513 186962 DEBUG nova.objects.instance [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lazy-loading 'pci_devices' on Instance uuid 373bf1d0-aa63-4995-87c5-d6a01e995a40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.551 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <uuid>373bf1d0-aa63-4995-87c5-d6a01e995a40</uuid>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <name>instance-00000027</name>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:name>tempest-AttachInterfacesV270Test-server-703092939</nova:name>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:56:47</nova:creationTime>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:user uuid="9e492001fe194231b630bba63bb7b39b">tempest-AttachInterfacesV270Test-769246837-project-member</nova:user>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:project uuid="df8cb7dc75584513bc47dd0afa74c82a">tempest-AttachInterfacesV270Test-769246837</nova:project>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        <nova:port uuid="2641b5e5-04d4-4190-9adf-40f11057bd93">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="serial">373bf1d0-aa63-4995-87c5-d6a01e995a40</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="uuid">373bf1d0-aa63-4995-87c5-d6a01e995a40</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.config"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:71:e7:8f"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <target dev="tap2641b5e5-04"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/console.log" append="off"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:56:47 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:56:47 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:56:47 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:56:47 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.552 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Preparing to wait for external event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.553 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.553 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.553 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.554 186962 DEBUG nova.virt.libvirt.vif [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:56:43Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.554 186962 DEBUG nova.network.os_vif_util [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.555 186962 DEBUG nova.network.os_vif_util [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.555 186962 DEBUG os_vif [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.556 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.556 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.557 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.562 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.562 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2641b5e5-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.563 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2641b5e5-04, col_values=(('external_ids', {'iface-id': '2641b5e5-04d4-4190-9adf-40f11057bd93', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:e7:8f', 'vm-uuid': '373bf1d0-aa63-4995-87c5-d6a01e995a40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:47 np0005539505 NetworkManager[55134]: <info>  [1764399407.6289] manager: (tap2641b5e5-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.630 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.638 186962 INFO os_vif [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04')#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.729 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.730 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5749MB free_disk=73.26831436157227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.731 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.731 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.789 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.790 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.790 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.853 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.867 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.898 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.899 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.957 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.957 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.958 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No VIF found with MAC fa:16:3e:71:e7:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:56:47 np0005539505 nova_compute[186958]: 2025-11-29 06:56:47.958 186962 INFO nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Using config drive#033[00m
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.084 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '373bf1d0-aa63-4995-87c5-d6a01e995a40', 'name': 'tempest-AttachInterfacesV270Test-server-703092939', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'user_id': '9e492001fe194231b630bba63bb7b39b', 'hostId': '48ca8fa3c9fb01993cf81fbe1904bd90644bc230116df710c8b2ea2a', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.085 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.087 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.089 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.091 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.092 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.093 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.095 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.096 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.097 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.098 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.098 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>]
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.099 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.099 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>]
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.100 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.101 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.103 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.103 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.103 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>]
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.104 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.106 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.107 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.108 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.109 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.109 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.110 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-703092939>]
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.111 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.112 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.113 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.115 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 01:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:56:48.116 12 DEBUG ceilometer.compute.pollsters [-] Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000027, id=373bf1d0-aa63-4995-87c5-d6a01e995a40>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.346 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.366 186962 INFO nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Creating config drive at /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.config#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.383 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps78cs4mr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.525 186962 DEBUG oslo_concurrency.processutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps78cs4mr" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:56:48 np0005539505 kernel: tap2641b5e5-04: entered promiscuous mode
Nov 29 01:56:48 np0005539505 NetworkManager[55134]: <info>  [1764399408.6256] manager: (tap2641b5e5-04): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Nov 29 01:56:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:48Z|00140|binding|INFO|Claiming lport 2641b5e5-04d4-4190-9adf-40f11057bd93 for this chassis.
Nov 29 01:56:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:48Z|00141|binding|INFO|2641b5e5-04d4-4190-9adf-40f11057bd93: Claiming fa:16:3e:71:e7:8f 10.100.0.13
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.626 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.637 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.649 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:e7:8f 10.100.0.13'], port_security=['fa:16:3e:71:e7:8f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '373bf1d0-aa63-4995-87c5-d6a01e995a40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87e0cd78-22d4-4402-b109-2d6faaf67db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a40acb2c-1f95-4f79-a33d-c84f0d8f72fe, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2641b5e5-04d4-4190-9adf-40f11057bd93) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.652 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2641b5e5-04d4-4190-9adf-40f11057bd93 in datapath 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f bound to our chassis#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.656 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f#033[00m
Nov 29 01:56:48 np0005539505 systemd-udevd[219609]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.686 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2387e218-e841-44af-869a-788462d7588b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.688 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04d895aa-71 in ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:56:48 np0005539505 systemd-machined[153285]: New machine qemu-17-instance-00000027.
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.692 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04d895aa-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.692 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f9eea104-9112-43e1-b771-f3f9de49697a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.693 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[38852220-eeb6-4455-b5f1-17efea8cd9aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 NetworkManager[55134]: <info>  [1764399408.7086] device (tap2641b5e5-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:56:48 np0005539505 NetworkManager[55134]: <info>  [1764399408.7095] device (tap2641b5e5-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.718 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[50028fe2-b3bc-4906-8b01-b005d47ce7be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.731 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:48Z|00142|binding|INFO|Setting lport 2641b5e5-04d4-4190-9adf-40f11057bd93 ovn-installed in OVS
Nov 29 01:56:48 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:48Z|00143|binding|INFO|Setting lport 2641b5e5-04d4-4190-9adf-40f11057bd93 up in Southbound
Nov 29 01:56:48 np0005539505 nova_compute[186958]: 2025-11-29 06:56:48.737 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:48 np0005539505 systemd[1]: Started Virtual Machine qemu-17-instance-00000027.
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.750 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7389ccf1-3659-4d0b-842b-cba5438a1229]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.784 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.793 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9fab4d-295f-44ce-a495-3445db9d324c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.805 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[51a7ee04-2bde-4f9e-aaa3-ad57ced2a37b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 NetworkManager[55134]: <info>  [1764399408.8069] manager: (tap04d895aa-70): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.859 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[efbdf9cc-ee90-461b-8a3a-0f2f44b65d84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.864 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[69759b58-d974-4867-8942-a0bd059a3280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 NetworkManager[55134]: <info>  [1764399408.9036] device (tap04d895aa-70): carrier: link connected
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.908 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[535ed6a4-aef5-4e67-b7e0-34cbdc760793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.933 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[903273db-826d-4e89-99db-76d6034b9b68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04d895aa-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:e6:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485648, 'reachable_time': 15467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219642, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.949 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2db4da99-c08e-4d52-9b2d-0a2ac43c8e01]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe65:e61a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485648, 'tstamp': 485648}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219643, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.965 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2d3bfc-1f34-4caa-a714-8d5b39c1812a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04d895aa-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:e6:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485648, 'reachable_time': 15467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219644, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:48.992 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec541eed-e719-4b85-b650-86c0d23429fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.067 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[671fdba9-ec9c-4b86-9052-e577a42e651d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.069 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04d895aa-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.069 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.070 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04d895aa-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:49 np0005539505 NetworkManager[55134]: <info>  [1764399409.0720] manager: (tap04d895aa-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 29 01:56:49 np0005539505 kernel: tap04d895aa-70: entered promiscuous mode
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.334 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04d895aa-70, col_values=(('external_ids', {'iface-id': '818f85b8-72bf-4339-84fb-0c72b9e71340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:56:49 np0005539505 ovn_controller[95143]: 2025-11-29T06:56:49Z|00144|binding|INFO|Releasing lport 818f85b8-72bf-4339-84fb-0c72b9e71340 from this chassis (sb_readonly=0)
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.335 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.339 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.355 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[33960804-1a55-440f-82eb-8a64f504460d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.357 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f.pid.haproxy
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:56:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:56:49.357 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'env', 'PROCESS_TAG=haproxy-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.359 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.490 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399409.489144, 373bf1d0-aa63-4995-87c5-d6a01e995a40 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.490 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] VM Started (Lifecycle Event)#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.620 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.630 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399409.4907148, 373bf1d0-aa63-4995-87c5-d6a01e995a40 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.631 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.809 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.816 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:49 np0005539505 nova_compute[186958]: 2025-11-29 06:56:49.855 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:56:49 np0005539505 podman[219683]: 2025-11-29 06:56:49.806950172 +0000 UTC m=+0.043430117 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:56:50 np0005539505 nova_compute[186958]: 2025-11-29 06:56:50.887 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:50 np0005539505 nova_compute[186958]: 2025-11-29 06:56:50.889 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:56:51 np0005539505 podman[219683]: 2025-11-29 06:56:51.375125863 +0000 UTC m=+1.611605748 container create d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.413 186962 DEBUG nova.network.neutron [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updated VIF entry in instance network info cache for port 2641b5e5-04d4-4190-9adf-40f11057bd93. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.414 186962 DEBUG nova.network.neutron [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:56:51 np0005539505 systemd[1]: Started libpod-conmon-d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51.scope.
Nov 29 01:56:51 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:56:51 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81ed4416ce6f3d19cc669ce3ccf7432b55294a1169fa3e7791777e2d8d48ebb1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.954 186962 DEBUG nova.compute.manager [req-da3e1b0b-4738-4dad-a6b1-32aa6b38a3d1 req-2f3a7ca0-7808-4ef9-9216-c2cfa4bd9e3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.956 186962 DEBUG oslo_concurrency.lockutils [req-da3e1b0b-4738-4dad-a6b1-32aa6b38a3d1 req-2f3a7ca0-7808-4ef9-9216-c2cfa4bd9e3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.957 186962 DEBUG oslo_concurrency.lockutils [req-da3e1b0b-4738-4dad-a6b1-32aa6b38a3d1 req-2f3a7ca0-7808-4ef9-9216-c2cfa4bd9e3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.958 186962 DEBUG oslo_concurrency.lockutils [req-da3e1b0b-4738-4dad-a6b1-32aa6b38a3d1 req-2f3a7ca0-7808-4ef9-9216-c2cfa4bd9e3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.959 186962 DEBUG nova.compute.manager [req-da3e1b0b-4738-4dad-a6b1-32aa6b38a3d1 req-2f3a7ca0-7808-4ef9-9216-c2cfa4bd9e3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Processing event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.962 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.963 186962 DEBUG oslo_concurrency.lockutils [req-304c8c59-0fe9-4ce8-8007-f8a367025dde req-5ff17e15-05da-482a-92ba-00e21021a2df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.970 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.972 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.973 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399411.9725773, 373bf1d0-aa63-4995-87c5-d6a01e995a40 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.973 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.978 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.980 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.980 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.982 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.988 186962 INFO nova.virt.libvirt.driver [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Instance spawned successfully.#033[00m
Nov 29 01:56:51 np0005539505 nova_compute[186958]: 2025-11-29 06:56:51.989 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:56:52 np0005539505 podman[219683]: 2025-11-29 06:56:52.02489316 +0000 UTC m=+2.261373085 container init d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 01:56:52 np0005539505 podman[219683]: 2025-11-29 06:56:52.030551344 +0000 UTC m=+2.267031229 container start d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 01:56:52 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [NOTICE]   (219702) : New worker (219704) forked
Nov 29 01:56:52 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [NOTICE]   (219702) : Loading success.
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.194 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.198 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.288 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.295 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.295 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.296 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.297 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.298 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.299 186962 DEBUG nova.virt.libvirt.driver [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.600 186962 INFO nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Took 8.62 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.601 186962 DEBUG nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:52 np0005539505 nova_compute[186958]: 2025-11-29 06:56:52.677 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:53 np0005539505 nova_compute[186958]: 2025-11-29 06:56:53.348 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:53 np0005539505 nova_compute[186958]: 2025-11-29 06:56:53.690 186962 INFO nova.compute.manager [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Took 10.21 seconds to build instance.#033[00m
Nov 29 01:56:53 np0005539505 nova_compute[186958]: 2025-11-29 06:56:53.994 186962 DEBUG oslo_concurrency.lockutils [None req-f9c546b8-2a01-4273-b8ab-3bf39212a03f 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.097 186962 DEBUG nova.compute.manager [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.098 186962 DEBUG oslo_concurrency.lockutils [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.098 186962 DEBUG oslo_concurrency.lockutils [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.098 186962 DEBUG oslo_concurrency.lockutils [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.099 186962 DEBUG nova.compute.manager [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.099 186962 WARNING nova.compute.manager [req-a91ab79f-c4d4-474b-9c86-ebf5a5afa9a9 req-52717663-d6ef-425a-940f-64343b86c43f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received unexpected event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.647 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "interface-373bf1d0-aa63-4995-87c5-d6a01e995a40-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.648 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "interface-373bf1d0-aa63-4995-87c5-d6a01e995a40-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.649 186962 DEBUG nova.objects.instance [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lazy-loading 'flavor' on Instance uuid 373bf1d0-aa63-4995-87c5-d6a01e995a40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.850 186962 DEBUG nova.objects.instance [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lazy-loading 'pci_requests' on Instance uuid 373bf1d0-aa63-4995-87c5-d6a01e995a40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:56:54 np0005539505 nova_compute[186958]: 2025-11-29 06:56:54.880 186962 DEBUG nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:56:55 np0005539505 nova_compute[186958]: 2025-11-29 06:56:55.587 186962 DEBUG nova.policy [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9e492001fe194231b630bba63bb7b39b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:56:56 np0005539505 podman[219714]: 2025-11-29 06:56:56.758042811 +0000 UTC m=+0.074694471 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:56:56 np0005539505 podman[219713]: 2025-11-29 06:56:56.765187407 +0000 UTC m=+0.088504560 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Nov 29 01:56:57 np0005539505 nova_compute[186958]: 2025-11-29 06:56:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:56:57 np0005539505 nova_compute[186958]: 2025-11-29 06:56:57.662 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:58 np0005539505 nova_compute[186958]: 2025-11-29 06:56:58.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:56:58 np0005539505 nova_compute[186958]: 2025-11-29 06:56:58.527 186962 DEBUG nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Successfully created port: d02e4190-7cb4-4954-91e8-510380d2319d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:57:00 np0005539505 podman[219757]: 2025-11-29 06:57:00.743498722 +0000 UTC m=+0.074011101 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.634 186962 DEBUG nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Successfully updated port: d02e4190-7cb4-4954-91e8-510380d2319d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.889 186962 DEBUG nova.compute.manager [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-changed-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.890 186962 DEBUG nova.compute.manager [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Refreshing instance network info cache due to event network-changed-d02e4190-7cb4-4954-91e8-510380d2319d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.890 186962 DEBUG oslo_concurrency.lockutils [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.890 186962 DEBUG oslo_concurrency.lockutils [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.891 186962 DEBUG nova.network.neutron [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Refreshing network info cache for port d02e4190-7cb4-4954-91e8-510380d2319d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:01 np0005539505 nova_compute[186958]: 2025-11-29 06:57:01.896 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:02 np0005539505 nova_compute[186958]: 2025-11-29 06:57:02.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.700 186962 DEBUG nova.network.neutron [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Added VIF to instance network info cache for port d02e4190-7cb4-4954-91e8-510380d2319d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3489#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.701 186962 DEBUG nova.network.neutron [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.722 186962 DEBUG oslo_concurrency.lockutils [req-66b21076-17b9-45f0-b240-40e9fb54e12d req-a1c21be9-fbc1-4a37-bb74-82cb2362aeeb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.723 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquired lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.723 186962 DEBUG nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.855 186962 WARNING nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f already exists in list: networks containing: ['04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f']. ignoring it#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.855 186962 WARNING nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f already exists in list: networks containing: ['04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f']. ignoring it#033[00m
Nov 29 01:57:03 np0005539505 nova_compute[186958]: 2025-11-29 06:57:03.856 186962 WARNING nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] d02e4190-7cb4-4954-91e8-510380d2319d already exists in list: port_ids containing: ['d02e4190-7cb4-4954-91e8-510380d2319d']. ignoring it#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.234 186962 DEBUG nova.compute.manager [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.385 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.386 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.418 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'pci_requests' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.438 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.439 186962 INFO nova.compute.claims [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.439 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'resources' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.459 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'numa_topology' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.479 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'pci_devices' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.533 186962 INFO nova.compute.resource_tracker [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating resource usage from migration 32315c80-25a1-4b88-8a75-5380619fbfaf#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.533 186962 DEBUG nova.compute.resource_tracker [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Starting to track incoming migration 32315c80-25a1-4b88-8a75-5380619fbfaf with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.696 186962 DEBUG nova.compute.provider_tree [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.716 186962 DEBUG nova.scheduler.client.report [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.742 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.743 186962 INFO nova.compute.manager [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Migrating#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.751 186962 DEBUG nova.network.neutron [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.779 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Releasing lock "refresh_cache-373bf1d0-aa63-4995-87c5-d6a01e995a40" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.783 186962 DEBUG nova.virt.libvirt.vif [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:52Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.784 186962 DEBUG nova.network.os_vif_util [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.785 186962 DEBUG nova.network.os_vif_util [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.785 186962 DEBUG os_vif [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.787 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.787 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.788 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.793 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd02e4190-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.793 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd02e4190-7c, col_values=(('external_ids', {'iface-id': 'd02e4190-7cb4-4954-91e8-510380d2319d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:67:b7', 'vm-uuid': '373bf1d0-aa63-4995-87c5-d6a01e995a40'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.796 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 NetworkManager[55134]: <info>  [1764399425.7966] manager: (tapd02e4190-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.798 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.803 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.805 186962 INFO os_vif [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c')#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.806 186962 DEBUG nova.virt.libvirt.vif [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:52Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.807 186962 DEBUG nova.network.os_vif_util [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.808 186962 DEBUG nova.network.os_vif_util [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.812 186962 DEBUG nova.virt.libvirt.guest [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] attach device xml: <interface type="ethernet">
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:cf:67:b7"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <target dev="tapd02e4190-7c"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]: </interface>
Nov 29 01:57:05 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 01:57:05 np0005539505 kernel: tapd02e4190-7c: entered promiscuous mode
Nov 29 01:57:05 np0005539505 NetworkManager[55134]: <info>  [1764399425.8257] manager: (tapd02e4190-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.826 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00145|binding|INFO|Claiming lport d02e4190-7cb4-4954-91e8-510380d2319d for this chassis.
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00146|binding|INFO|d02e4190-7cb4-4954-91e8-510380d2319d: Claiming fa:16:3e:cf:67:b7 10.100.0.8
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.834 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:67:b7 10.100.0.8'], port_security=['fa:16:3e:cf:67:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '373bf1d0-aa63-4995-87c5-d6a01e995a40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '87e0cd78-22d4-4402-b109-2d6faaf67db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a40acb2c-1f95-4f79-a33d-c84f0d8f72fe, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d02e4190-7cb4-4954-91e8-510380d2319d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.836 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d02e4190-7cb4-4954-91e8-510380d2319d in datapath 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f bound to our chassis#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.838 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f#033[00m
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00147|binding|INFO|Setting lport d02e4190-7cb4-4954-91e8-510380d2319d ovn-installed in OVS
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00148|binding|INFO|Setting lport d02e4190-7cb4-4954-91e8-510380d2319d up in Southbound
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.843 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.856 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7e91c7-47e0-4019-9b87-2bccab0e8f57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 systemd-udevd[219806]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:05 np0005539505 NetworkManager[55134]: <info>  [1764399425.8830] device (tapd02e4190-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:05 np0005539505 NetworkManager[55134]: <info>  [1764399425.8853] device (tapd02e4190-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.893 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[35fbe353-a6c7-4704-94e5-336197417dce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.898 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[86d5310b-b60c-45d5-a353-e5805254b34a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.916 186962 DEBUG nova.virt.libvirt.driver [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.917 186962 DEBUG nova.virt.libvirt.driver [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.917 186962 DEBUG nova.virt.libvirt.driver [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No VIF found with MAC fa:16:3e:71:e7:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.918 186962 DEBUG nova.virt.libvirt.driver [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] No VIF found with MAC fa:16:3e:cf:67:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.930 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e7db0146-4fe1-4c09-85fb-a0372823527e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.948 186962 DEBUG nova.virt.libvirt.guest [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesV270Test-server-703092939</nova:name>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 06:57:05</nova:creationTime>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:user uuid="9e492001fe194231b630bba63bb7b39b">tempest-AttachInterfacesV270Test-769246837-project-member</nova:user>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:project uuid="df8cb7dc75584513bc47dd0afa74c82a">tempest-AttachInterfacesV270Test-769246837</nova:project>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:port uuid="2641b5e5-04d4-4190-9adf-40f11057bd93">
Nov 29 01:57:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    <nova:port uuid="d02e4190-7cb4-4954-91e8-510380d2319d">
Nov 29 01:57:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 01:57:05 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 01:57:05 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 01:57:05 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[054a259e-75d1-4ece-b7ac-c337f971fa6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04d895aa-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:e6:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485648, 'reachable_time': 15467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219812, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.970 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31c05abc-9f9d-4c4c-84b4-85c646e0cadc]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap04d895aa-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485660, 'tstamp': 485660}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219813, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap04d895aa-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485664, 'tstamp': 485664}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219813, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.973 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04d895aa-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.976 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.977 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04d895aa-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.977 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.977 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04d895aa-70, col_values=(('external_ids', {'iface-id': '818f85b8-72bf-4339-84fb-0c72b9e71340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:05.978 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:05 np0005539505 nova_compute[186958]: 2025-11-29 06:57:05.979 186962 DEBUG oslo_concurrency.lockutils [None req-92d2d598-9898-4894-a96e-85dbed8ea1ec 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "interface-373bf1d0-aa63-4995-87c5-d6a01e995a40-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 11.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:71:e7:8f 10.100.0.13
Nov 29 01:57:05 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:05Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:71:e7:8f 10.100.0.13
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.496 186962 DEBUG nova.compute.manager [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.497 186962 DEBUG oslo_concurrency.lockutils [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.498 186962 DEBUG oslo_concurrency.lockutils [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.498 186962 DEBUG oslo_concurrency.lockutils [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.498 186962 DEBUG nova.compute.manager [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:06 np0005539505 nova_compute[186958]: 2025-11-29 06:57:06.499 186962 WARNING nova.compute.manager [req-24888d04-1af9-4cf6-b4a6-a0a987795403 req-69215889-6c1b-45f4-8016-50d4e67025b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received unexpected event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d for instance with vm_state active and task_state None.#033[00m
Nov 29 01:57:07 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:57:07 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:57:07 np0005539505 systemd-logind[794]: New session 33 of user nova.
Nov 29 01:57:07 np0005539505 podman[219816]: 2025-11-29 06:57:07.22590644 +0000 UTC m=+0.063433875 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 01:57:07 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:57:07 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:57:07 np0005539505 systemd[219842]: Queued start job for default target Main User Target.
Nov 29 01:57:07 np0005539505 systemd[219842]: Created slice User Application Slice.
Nov 29 01:57:07 np0005539505 systemd[219842]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:57:07 np0005539505 systemd[219842]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:57:07 np0005539505 systemd[219842]: Reached target Paths.
Nov 29 01:57:07 np0005539505 systemd[219842]: Reached target Timers.
Nov 29 01:57:07 np0005539505 systemd[219842]: Starting D-Bus User Message Bus Socket...
Nov 29 01:57:07 np0005539505 systemd[219842]: Starting Create User's Volatile Files and Directories...
Nov 29 01:57:07 np0005539505 systemd[219842]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:57:07 np0005539505 systemd[219842]: Reached target Sockets.
Nov 29 01:57:07 np0005539505 systemd[219842]: Finished Create User's Volatile Files and Directories.
Nov 29 01:57:07 np0005539505 systemd[219842]: Reached target Basic System.
Nov 29 01:57:07 np0005539505 systemd[219842]: Reached target Main User Target.
Nov 29 01:57:07 np0005539505 systemd[219842]: Startup finished in 168ms.
Nov 29 01:57:07 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:57:07 np0005539505 systemd[1]: Started Session 33 of User nova.
Nov 29 01:57:07 np0005539505 systemd[1]: session-33.scope: Deactivated successfully.
Nov 29 01:57:07 np0005539505 systemd-logind[794]: Session 33 logged out. Waiting for processes to exit.
Nov 29 01:57:07 np0005539505 systemd-logind[794]: Removed session 33.
Nov 29 01:57:07 np0005539505 systemd-logind[794]: New session 35 of user nova.
Nov 29 01:57:07 np0005539505 systemd[1]: Started Session 35 of User nova.
Nov 29 01:57:07 np0005539505 systemd[1]: session-35.scope: Deactivated successfully.
Nov 29 01:57:07 np0005539505 systemd-logind[794]: Session 35 logged out. Waiting for processes to exit.
Nov 29 01:57:07 np0005539505 systemd-logind[794]: Removed session 35.
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.854 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.857 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.858 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.858 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.858 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.873 186962 INFO nova.compute.manager [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Terminating instance#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.888 186962 DEBUG nova.compute.manager [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:07 np0005539505 kernel: tap2641b5e5-04 (unregistering): left promiscuous mode
Nov 29 01:57:07 np0005539505 NetworkManager[55134]: <info>  [1764399427.9200] device (tap2641b5e5-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00149|binding|INFO|Releasing lport 2641b5e5-04d4-4190-9adf-40f11057bd93 from this chassis (sb_readonly=0)
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00150|binding|INFO|Setting lport 2641b5e5-04d4-4190-9adf-40f11057bd93 down in Southbound
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00151|binding|INFO|Removing iface tap2641b5e5-04 ovn-installed in OVS
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:07.941 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:e7:8f 10.100.0.13'], port_security=['fa:16:3e:71:e7:8f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '373bf1d0-aa63-4995-87c5-d6a01e995a40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87e0cd78-22d4-4402-b109-2d6faaf67db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a40acb2c-1f95-4f79-a33d-c84f0d8f72fe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2641b5e5-04d4-4190-9adf-40f11057bd93) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:07.942 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2641b5e5-04d4-4190-9adf-40f11057bd93 in datapath 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f unbound from our chassis#033[00m
Nov 29 01:57:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:07.944 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f#033[00m
Nov 29 01:57:07 np0005539505 kernel: tapd02e4190-7c (unregistering): left promiscuous mode
Nov 29 01:57:07 np0005539505 NetworkManager[55134]: <info>  [1764399427.9502] device (tapd02e4190-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.954 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:07.969 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9dc2a98a-8f2c-43bd-b5ff-3f1f9fc74209]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.970 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00152|binding|INFO|Releasing lport d02e4190-7cb4-4954-91e8-510380d2319d from this chassis (sb_readonly=0)
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00153|binding|INFO|Setting lport d02e4190-7cb4-4954-91e8-510380d2319d down in Southbound
Nov 29 01:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:07Z|00154|binding|INFO|Removing iface tapd02e4190-7c ovn-installed in OVS
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:07.983 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:67:b7 10.100.0.8'], port_security=['fa:16:3e:cf:67:b7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '373bf1d0-aa63-4995-87c5-d6a01e995a40', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'df8cb7dc75584513bc47dd0afa74c82a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '87e0cd78-22d4-4402-b109-2d6faaf67db4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a40acb2c-1f95-4f79-a33d-c84f0d8f72fe, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d02e4190-7cb4-4954-91e8-510380d2319d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:07 np0005539505 nova_compute[186958]: 2025-11-29 06:57:07.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000027.scope: Deactivated successfully.
Nov 29 01:57:08 np0005539505 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000027.scope: Consumed 13.876s CPU time.
Nov 29 01:57:08 np0005539505 systemd-machined[153285]: Machine qemu-17-instance-00000027 terminated.
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.004 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3083ad55-7b4a-4976-948d-da373e9d04bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.011 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8ce959ec-96ea-484f-9732-8b00bb850c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.042 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[76f31eea-07ca-413d-9161-bb25638f92fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.062 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b9810fc9-da99-4489-b59a-4bd7d2f2aef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04d895aa-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:65:e6:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485648, 'reachable_time': 15467, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219902, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.085 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[902f1361-eef2-4709-a95a-1b64f04829b3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap04d895aa-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485660, 'tstamp': 485660}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219906, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap04d895aa-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485664, 'tstamp': 485664}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219906, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.088 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04d895aa-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.089 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 podman[219864]: 2025-11-29 06:57:08.096897992 +0000 UTC m=+0.144490139 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.097 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.097 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04d895aa-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.098 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.098 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04d895aa-70, col_values=(('external_ids', {'iface-id': '818f85b8-72bf-4339-84fb-0c72b9e71340'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.099 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.101 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d02e4190-7cb4-4954-91e8-510380d2319d in datapath 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f unbound from our chassis#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.102 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.103 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[92ed4416-f76a-4afd-aeaa-0733fb070d4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:08.105 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f namespace which is not needed anymore#033[00m
Nov 29 01:57:08 np0005539505 NetworkManager[55134]: <info>  [1764399428.1133] manager: (tap2641b5e5-04): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.116 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 NetworkManager[55134]: <info>  [1764399428.1249] manager: (tapd02e4190-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.125 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.178 186962 INFO nova.virt.libvirt.driver [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Instance destroyed successfully.#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.178 186962 DEBUG nova.objects.instance [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lazy-loading 'resources' on Instance uuid 373bf1d0-aa63-4995-87c5-d6a01e995a40 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.204 186962 DEBUG nova.virt.libvirt.vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:52Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.205 186962 DEBUG nova.network.os_vif_util [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.206 186962 DEBUG nova.network.os_vif_util [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.206 186962 DEBUG os_vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.208 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2641b5e5-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.210 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.212 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.216 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.220 186962 INFO os_vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:71:e7:8f,bridge_name='br-int',has_traffic_filtering=True,id=2641b5e5-04d4-4190-9adf-40f11057bd93,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2641b5e5-04')#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.221 186962 DEBUG nova.virt.libvirt.vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:56:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-703092939',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-703092939',id=39,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:56:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='df8cb7dc75584513bc47dd0afa74c82a',ramdisk_id='',reservation_id='r-048isott',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-769246837',owner_user_name='tempest-AttachInterfacesV270Test-769246837-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:56:52Z,user_data=None,user_id='9e492001fe194231b630bba63bb7b39b',uuid=373bf1d0-aa63-4995-87c5-d6a01e995a40,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.221 186962 DEBUG nova.network.os_vif_util [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converting VIF {"id": "d02e4190-7cb4-4954-91e8-510380d2319d", "address": "fa:16:3e:cf:67:b7", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd02e4190-7c", "ovs_interfaceid": "d02e4190-7cb4-4954-91e8-510380d2319d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.222 186962 DEBUG nova.network.os_vif_util [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.222 186962 DEBUG os_vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.224 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.224 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd02e4190-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.230 186962 INFO os_vif [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:67:b7,bridge_name='br-int',has_traffic_filtering=True,id=d02e4190-7cb4-4954-91e8-510380d2319d,network=Network(04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd02e4190-7c')#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.231 186962 INFO nova.virt.libvirt.driver [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Deleting instance files /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40_del#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.232 186962 INFO nova.virt.libvirt.driver [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Deletion of /var/lib/nova/instances/373bf1d0-aa63-4995-87c5-d6a01e995a40_del complete#033[00m
Nov 29 01:57:08 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [NOTICE]   (219702) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:08 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [NOTICE]   (219702) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:08 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [WARNING]  (219702) : Exiting Master process...
Nov 29 01:57:08 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [ALERT]    (219702) : Current worker (219704) exited with code 143 (Terminated)
Nov 29 01:57:08 np0005539505 neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f[219698]: [WARNING]  (219702) : All workers exited. Exiting... (0)
Nov 29 01:57:08 np0005539505 systemd[1]: libpod-d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51.scope: Deactivated successfully.
Nov 29 01:57:08 np0005539505 podman[219955]: 2025-11-29 06:57:08.267394392 +0000 UTC m=+0.059003417 container died d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.297 186962 INFO nova.compute.manager [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.298 186962 DEBUG oslo.service.loopingcall [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.298 186962 DEBUG nova.compute.manager [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.299 186962 DEBUG nova.network.neutron [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.355 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:08 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:08 np0005539505 systemd[1]: var-lib-containers-storage-overlay-81ed4416ce6f3d19cc669ce3ccf7432b55294a1169fa3e7791777e2d8d48ebb1-merged.mount: Deactivated successfully.
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.597 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.598 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.598 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.598 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 WARNING nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received unexpected event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-unplugged-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.599 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.600 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-unplugged-2641b5e5-04d4-4190-9adf-40f11057bd93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.600 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-unplugged-2641b5e5-04d4-4190-9adf-40f11057bd93 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.600 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.600 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.601 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.601 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.601 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.601 186962 WARNING nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received unexpected event network-vif-plugged-2641b5e5-04d4-4190-9adf-40f11057bd93 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.602 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-unplugged-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.602 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.602 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.602 186962 DEBUG oslo_concurrency.lockutils [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.602 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-unplugged-d02e4190-7cb4-4954-91e8-510380d2319d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.603 186962 DEBUG nova.compute.manager [req-b035fa67-d226-4db0-b35c-fa7bc33e1cd2 req-fa031f44-bdfb-457e-8378-f5fe8240f6e3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-unplugged-d02e4190-7cb4-4954-91e8-510380d2319d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.952 186962 DEBUG nova.compute.manager [req-69943946-3193-4725-b663-923ff08f61d4 req-a71f84eb-6e63-40bf-a6b9-7c37ea8fa4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-deleted-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.952 186962 INFO nova.compute.manager [req-69943946-3193-4725-b663-923ff08f61d4 req-a71f84eb-6e63-40bf-a6b9-7c37ea8fa4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Neutron deleted interface d02e4190-7cb4-4954-91e8-510380d2319d; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.953 186962 DEBUG nova.network.neutron [req-69943946-3193-4725-b663-923ff08f61d4 req-a71f84eb-6e63-40bf-a6b9-7c37ea8fa4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [{"id": "2641b5e5-04d4-4190-9adf-40f11057bd93", "address": "fa:16:3e:71:e7:8f", "network": {"id": "04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-1234877217-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "df8cb7dc75584513bc47dd0afa74c82a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2641b5e5-04", "ovs_interfaceid": "2641b5e5-04d4-4190-9adf-40f11057bd93", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:08 np0005539505 nova_compute[186958]: 2025-11-29 06:57:08.973 186962 DEBUG nova.compute.manager [req-69943946-3193-4725-b663-923ff08f61d4 req-a71f84eb-6e63-40bf-a6b9-7c37ea8fa4ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Detach interface failed, port_id=d02e4190-7cb4-4954-91e8-510380d2319d, reason: Instance 373bf1d0-aa63-4995-87c5-d6a01e995a40 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 01:57:09 np0005539505 podman[219955]: 2025-11-29 06:57:09.086395062 +0000 UTC m=+0.878004067 container cleanup d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:09 np0005539505 systemd[1]: libpod-conmon-d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51.scope: Deactivated successfully.
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.194 186962 DEBUG nova.network.neutron [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.212 186962 INFO nova.compute.manager [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Took 0.91 seconds to deallocate network for instance.#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.301 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.301 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.376 186962 DEBUG nova.compute.provider_tree [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.395 186962 DEBUG nova.scheduler.client.report [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.417 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.439 186962 INFO nova.scheduler.client.report [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Deleted allocations for instance 373bf1d0-aa63-4995-87c5-d6a01e995a40#033[00m
Nov 29 01:57:09 np0005539505 nova_compute[186958]: 2025-11-29 06:57:09.527 186962 DEBUG oslo_concurrency.lockutils [None req-ac0bfd18-236f-4288-9751-e49816a8cd43 9e492001fe194231b630bba63bb7b39b df8cb7dc75584513bc47dd0afa74c82a - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:10 np0005539505 podman[219989]: 2025-11-29 06:57:10.144432324 +0000 UTC m=+1.038455297 container remove d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.151 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4588f40e-246f-4a5d-a396-dc186fd15879]: (4, ('Sat Nov 29 06:57:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f (d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51)\nd041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51\nSat Nov 29 06:57:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f (d041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51)\nd041d85adaef6392279c37c2849d0b2496d1ea501d6b1f5ba9b49614eb191b51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.153 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[daf39458-9586-4529-85ab-4533311fae79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.154 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04d895aa-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.157 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:10 np0005539505 kernel: tap04d895aa-70: left promiscuous mode
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.168 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.172 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e220f5dc-7b32-473d-9186-4a2b6fefa7af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.183 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6b587b-bec8-476d-a4ca-b1babb2e0db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.185 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[872f192d-a6ff-4ebc-9856-d85070f17ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.201 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b47c3e1-6465-4aed-a3bc-fbd892a950e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485636, 'reachable_time': 42456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220005, 'error': None, 'target': 'ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 systemd[1]: run-netns-ovnmeta\x2d04d895aa\x2d7f4f\x2d4bfc\x2dad50\x2d9ec39ad4ac3f.mount: Deactivated successfully.
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.206 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04d895aa-7f4f-4bfc-ad50-9ec39ad4ac3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:57:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:10.206 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[deff3984-6d0d-478f-9247-c7bfb82804cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.686 186962 DEBUG nova.compute.manager [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.686 186962 DEBUG oslo_concurrency.lockutils [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.687 186962 DEBUG oslo_concurrency.lockutils [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.687 186962 DEBUG oslo_concurrency.lockutils [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "373bf1d0-aa63-4995-87c5-d6a01e995a40-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.687 186962 DEBUG nova.compute.manager [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] No waiting events found dispatching network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.687 186962 WARNING nova.compute.manager [req-b9da6171-212f-42db-8fba-182647f9259a req-07df3edd-a090-4ccf-9680-9e98c99abb7a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received unexpected event network-vif-plugged-d02e4190-7cb4-4954-91e8-510380d2319d for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.817 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.818 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.834 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.922 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.923 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.928 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:57:10 np0005539505 nova_compute[186958]: 2025-11-29 06:57:10.929 186962 INFO nova.compute.claims [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.064 186962 DEBUG nova.compute.provider_tree [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.074 186962 DEBUG nova.compute.manager [req-1a46aceb-bfc9-4836-8134-3e7b308d6d20 req-4bcdcbc9-76d9-4011-be66-3b897111426c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Received event network-vif-deleted-2641b5e5-04d4-4190-9adf-40f11057bd93 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.078 186962 DEBUG nova.scheduler.client.report [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.095 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.095 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.143 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.143 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.158 186962 INFO nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.172 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.302 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.304 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.304 186962 INFO nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Creating image(s)#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.305 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.305 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.306 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.322 186962 DEBUG nova.policy [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b509e6a04cd147779a714856e3cd95ab', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32234968781646cf869d42134e62b91c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.327 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.395 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.397 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.398 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.414 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.472 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.474 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.618 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk 1073741824" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.621 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.622 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.692 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.693 186962 DEBUG nova.virt.disk.api [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Checking if we can resize image /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.694 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.754 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.755 186962 DEBUG nova.virt.disk.api [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Cannot resize image /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.756 186962 DEBUG nova.objects.instance [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'migration_context' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.770 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.771 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Ensure instance console log exists: /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.771 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.772 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.772 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.943 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.944 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:11 np0005539505 nova_compute[186958]: 2025-11-29 06:57:11.965 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.027 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Successfully created port: 6c054d6e-43d7-4885-81ce-27d38953d930 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.060 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.061 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.070 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.071 186962 INFO nova.compute.claims [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.248 186962 DEBUG nova.compute.provider_tree [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.264 186962 DEBUG nova.scheduler.client.report [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.287 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.288 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.336 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.337 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.356 186962 INFO nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.374 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.493 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.495 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.495 186962 INFO nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating image(s)#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.496 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.496 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.497 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.509 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.571 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.572 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.573 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.584 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.652 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.653 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:12 np0005539505 nova_compute[186958]: 2025-11-29 06:57:12.673 186962 DEBUG nova.policy [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:57:12 np0005539505 podman[220029]: 2025-11-29 06:57:12.748368532 +0000 UTC m=+0.072120096 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.209 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk 1073741824" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.210 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.211 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.284 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.285 186962 DEBUG nova.virt.disk.api [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Checking if we can resize image /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.285 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.342 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.343 186962 DEBUG nova.virt.disk.api [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Cannot resize image /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.344 186962 DEBUG nova.objects.instance [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.361 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.361 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Ensure instance console log exists: /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.362 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.362 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.363 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.734 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Successfully created port: 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.760 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.870 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Successfully updated port: 6c054d6e-43d7-4885-81ce-27d38953d930 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.888 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.889 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquired lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.889 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.989 186962 DEBUG nova.compute.manager [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.990 186962 DEBUG nova.compute.manager [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing instance network info cache due to event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:13 np0005539505 nova_compute[186958]: 2025-11-29 06:57:13.991 186962 DEBUG oslo_concurrency.lockutils [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:14 np0005539505 nova_compute[186958]: 2025-11-29 06:57:14.466 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.299 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Successfully updated port: 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.315 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.316 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquired lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.316 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.562 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.777 186962 DEBUG nova.network.neutron [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.800 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Releasing lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.801 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance network_info: |[{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.802 186962 DEBUG oslo_concurrency.lockutils [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.802 186962 DEBUG nova.network.neutron [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.807 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start _get_guest_xml network_info=[{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.820 186962 WARNING nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.826 186962 DEBUG nova.virt.libvirt.host [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.828 186962 DEBUG nova.virt.libvirt.host [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.832 186962 DEBUG nova.virt.libvirt.host [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.833 186962 DEBUG nova.virt.libvirt.host [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.834 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.835 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.835 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.835 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.836 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.836 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.836 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.836 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.837 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.837 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.837 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.837 186962 DEBUG nova.virt.hardware [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.841 186962 DEBUG nova.virt.libvirt.vif [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsT
estJSON-1623643057-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:11Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.842 186962 DEBUG nova.network.os_vif_util [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.843 186962 DEBUG nova.network.os_vif_util [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.844 186962 DEBUG nova.objects.instance [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.861 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <uuid>2d8568dc-c82e-43a5-a9f1-46434e7873a1</uuid>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <name>instance-00000029</name>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:name>tempest-SecurityGroupsTestJSON-server-2017235252</nova:name>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:15</nova:creationTime>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:user uuid="b509e6a04cd147779a714856e3cd95ab">tempest-SecurityGroupsTestJSON-1623643057-project-member</nova:user>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:project uuid="32234968781646cf869d42134e62b91c">tempest-SecurityGroupsTestJSON-1623643057</nova:project>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        <nova:port uuid="6c054d6e-43d7-4885-81ce-27d38953d930">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="serial">2d8568dc-c82e-43a5-a9f1-46434e7873a1</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="uuid">2d8568dc-c82e-43a5-a9f1-46434e7873a1</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:2d:70:bf"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <target dev="tap6c054d6e-43"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/console.log" append="off"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:15 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:15 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:15 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:15 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.862 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Preparing to wait for external event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.863 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.863 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.863 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.864 186962 DEBUG nova.virt.libvirt.vif [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:11Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.864 186962 DEBUG nova.network.os_vif_util [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.865 186962 DEBUG nova.network.os_vif_util [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.865 186962 DEBUG os_vif [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.866 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.867 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.870 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.871 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c054d6e-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.871 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c054d6e-43, col_values=(('external_ids', {'iface-id': '6c054d6e-43d7-4885-81ce-27d38953d930', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:70:bf', 'vm-uuid': '2d8568dc-c82e-43a5-a9f1-46434e7873a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:15 np0005539505 NetworkManager[55134]: <info>  [1764399435.8738] manager: (tap6c054d6e-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.882 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:15 np0005539505 nova_compute[186958]: 2025-11-29 06:57:15.884 186962 INFO os_vif [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43')#033[00m
Nov 29 01:57:15 np0005539505 podman[220063]: 2025-11-29 06:57:15.969138595 +0000 UTC m=+0.058028429 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd)
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.106 186962 DEBUG nova.compute.manager [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-changed-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.106 186962 DEBUG nova.compute.manager [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Refreshing instance network info cache due to event network-changed-4dfd4ca1-34cd-46b3-9509-5d99ceaee255. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.106 186962 DEBUG oslo_concurrency.lockutils [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.366 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.366 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.367 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] No VIF found with MAC fa:16:3e:2d:70:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.368 186962 INFO nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Using config drive#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.733 186962 DEBUG nova.network.neutron [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Updating instance_info_cache with network_info: [{"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.763 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Releasing lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.763 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance network_info: |[{"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.764 186962 DEBUG oslo_concurrency.lockutils [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.764 186962 DEBUG nova.network.neutron [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Refreshing network info cache for port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.767 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start _get_guest_xml network_info=[{"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.771 186962 WARNING nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.775 186962 DEBUG nova.virt.libvirt.host [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.776 186962 DEBUG nova.virt.libvirt.host [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.781 186962 INFO nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Creating config drive at /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.786 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7akd8vyk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.808 186962 DEBUG nova.virt.libvirt.host [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.810 186962 DEBUG nova.virt.libvirt.host [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.814 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.814 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.815 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.815 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.815 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.816 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.816 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.817 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.817 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.817 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.818 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.818 186962 DEBUG nova.virt.hardware [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.825 186962 DEBUG nova.virt.libvirt.vif [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerAct
ionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:12Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.826 186962 DEBUG nova.network.os_vif_util [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.828 186962 DEBUG nova.network.os_vif_util [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.829 186962 DEBUG nova.objects.instance [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.856 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <uuid>4d501671-0357-4018-80f9-31d43293d107</uuid>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <name>instance-0000002a</name>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-1768107239</nova:name>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:16</nova:creationTime>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:user uuid="812d926ee4ed4159b2e88b7a69990423">tempest-ServerActionsTestOtherA-229564135-project-member</nova:user>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:project uuid="4362be0b90a64d63b2294bbc495486d3">tempest-ServerActionsTestOtherA-229564135</nova:project>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        <nova:port uuid="4dfd4ca1-34cd-46b3-9509-5d99ceaee255">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="serial">4d501671-0357-4018-80f9-31d43293d107</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="uuid">4d501671-0357-4018-80f9-31d43293d107</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:61:c9:0d"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <target dev="tap4dfd4ca1-34"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/console.log" append="off"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:16 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:16 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:16 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:16 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.858 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Preparing to wait for external event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.858 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.858 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.858 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.859 186962 DEBUG nova.virt.libvirt.vif [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest
-ServerActionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:12Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.860 186962 DEBUG nova.network.os_vif_util [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.861 186962 DEBUG nova.network.os_vif_util [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.861 186962 DEBUG os_vif [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.862 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.862 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.863 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.867 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4dfd4ca1-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.867 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4dfd4ca1-34, col_values=(('external_ids', {'iface-id': '4dfd4ca1-34cd-46b3-9509-5d99ceaee255', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:c9:0d', 'vm-uuid': '4d501671-0357-4018-80f9-31d43293d107'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:16 np0005539505 NetworkManager[55134]: <info>  [1764399436.8698] manager: (tap4dfd4ca1-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.879 186962 INFO os_vif [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34')#033[00m
Nov 29 01:57:16 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.921 186962 DEBUG oslo_concurrency.processutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7akd8vyk" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:16 np0005539505 kernel: tap6c054d6e-43: entered promiscuous mode
Nov 29 01:57:16 np0005539505 NetworkManager[55134]: <info>  [1764399436.9896] manager: (tap6c054d6e-43): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Nov 29 01:57:16 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:16Z|00155|binding|INFO|Claiming lport 6c054d6e-43d7-4885-81ce-27d38953d930 for this chassis.
Nov 29 01:57:16 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:16Z|00156|binding|INFO|6c054d6e-43d7-4885-81ce-27d38953d930: Claiming fa:16:3e:2d:70:bf 10.100.0.6
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:16.998 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.011 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:70:bf 10.100.0.6'], port_security=['fa:16:3e:2d:70:bf 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2d8568dc-c82e-43a5-a9f1-46434e7873a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1fb16d9c-331a-41c7-a6da-9b1479dbf50c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6c054d6e-43d7-4885-81ce-27d38953d930) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.012 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6c054d6e-43d7-4885-81ce-27d38953d930 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db bound to our chassis#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.014 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09d1caf9-4b04-433c-8535-2cd6d44437db#033[00m
Nov 29 01:57:17 np0005539505 systemd-udevd[220104]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.028 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a5a4220-d6f2-423f-80d0-8c6f851460ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.029 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09d1caf9-41 in ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.031 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09d1caf9-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.031 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c56832ef-dc28-4bec-80c3-04812cf8a8b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.032 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[937407b3-a0e1-4d92-b4eb-865d63f6b6f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 systemd-machined[153285]: New machine qemu-18-instance-00000029.
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.0394] device (tap6c054d6e-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.0406] device (tap6c054d6e-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.045 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cb8017-aba7-4e21-8ebc-1d769385de8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 systemd[1]: Started Virtual Machine qemu-18-instance-00000029.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.063 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00157|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 ovn-installed in OVS
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00158|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 up in Southbound
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.084 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f53ccc7b-79b9-4fbd-b472-8343f8b0b102]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.085 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.115 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[18323cdd-7049-4182-b163-d00b263cf714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 systemd-udevd[220108]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.121 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2659c019-83dd-492a-8ac1-114d8ddf8beb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.1224] manager: (tap09d1caf9-40): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.158 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ee9689-67f3-4e90-b275-a8c07aa18a1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.162 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1953d8-fee2-47ba-a775-7462d91828d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.1896] device (tap09d1caf9-40): carrier: link connected
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.195 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[61f4735f-6505-4667-ab54-5d02782716ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.217 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2a1ac588-9e29-410c-9070-c97437a51174]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488477, 'reachable_time': 29671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220142, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.234 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.234 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.234 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No VIF found with MAC fa:16:3e:61:c9:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.235 186962 INFO nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Using config drive#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.237 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7a8e8b-d4ce-439d-941a-eb0bceaa9cf8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea3:d9b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488477, 'tstamp': 488477}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220143, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.254 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4ead7f7e-7ed1-4587-a6a5-849ec7fdfb62]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488477, 'reachable_time': 29671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220144, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.288 186962 DEBUG nova.network.neutron [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updated VIF entry in instance network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.289 186962 DEBUG nova.network.neutron [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.299 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad06dfc-2104-4e81-b752-1da2948b599f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.305 186962 DEBUG oslo_concurrency.lockutils [req-2f221ecc-0200-4e7b-a7a7-1ffe06185533 req-2febd6e2-c4d5-49ae-8943-cdc93a59e71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.364 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[249c42a5-a607-4896-8f4e-6b94c2383931]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.365 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.366 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.366 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09d1caf9-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.3689] manager: (tap09d1caf9-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 29 01:57:17 np0005539505 kernel: tap09d1caf9-40: entered promiscuous mode
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.373 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.374 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09d1caf9-40, col_values=(('external_ids', {'iface-id': '4919a746-8c41-4e1b-93e2-17dfe2a5b063'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00159|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.384 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399437.3835888, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.384 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Started (Lifecycle Event)#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.389 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.393 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.394 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.395 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4999d756-f493-4f7c-9d29-4b0886780841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.397 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.398 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'env', 'PROCESS_TAG=haproxy-09d1caf9-4b04-433c-8535-2cd6d44437db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09d1caf9-4b04-433c-8535-2cd6d44437db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.409 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.440 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399437.383835, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.440 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Paused (Lifecycle Event)
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.458 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.461 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.484 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.564 186962 INFO nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating config drive at /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.571 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplt9jauzi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.654 186962 DEBUG nova.compute.manager [req-9665f8db-958c-4273-8955-038a6deda2a7 req-d4d28e5f-47e4-4150-9ac2-e37ab86b7a7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.655 186962 DEBUG oslo_concurrency.lockutils [req-9665f8db-958c-4273-8955-038a6deda2a7 req-d4d28e5f-47e4-4150-9ac2-e37ab86b7a7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.655 186962 DEBUG oslo_concurrency.lockutils [req-9665f8db-958c-4273-8955-038a6deda2a7 req-d4d28e5f-47e4-4150-9ac2-e37ab86b7a7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.655 186962 DEBUG oslo_concurrency.lockutils [req-9665f8db-958c-4273-8955-038a6deda2a7 req-d4d28e5f-47e4-4150-9ac2-e37ab86b7a7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.655 186962 DEBUG nova.compute.manager [req-9665f8db-958c-4273-8955-038a6deda2a7 req-d4d28e5f-47e4-4150-9ac2-e37ab86b7a7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Processing event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.656 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.661 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.662 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399437.66153, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.662 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Resumed (Lifecycle Event)
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.667 186962 INFO nova.virt.libvirt.driver [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance spawned successfully.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.667 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.691 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.702 186962 DEBUG oslo_concurrency.processutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplt9jauzi" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.706 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.713 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.713 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.714 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.714 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.715 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.715 186962 DEBUG nova.virt.libvirt.driver [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.740 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.7681] manager: (tap4dfd4ca1-34): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 29 01:57:17 np0005539505 kernel: tap4dfd4ca1-34: entered promiscuous mode
Nov 29 01:57:17 np0005539505 systemd-udevd[220132]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.784 186962 INFO nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Took 6.48 seconds to spawn the instance on the hypervisor.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.784 186962 DEBUG nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.7931] device (tap4dfd4ca1-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.7940] device (tap4dfd4ca1-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00160|binding|INFO|Claiming lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for this chassis.
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00161|binding|INFO|4dfd4ca1-34cd-46b3-9509-5d99ceaee255: Claiming fa:16:3e:61:c9:0d 10.100.0.5
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.805 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.8062] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 29 01:57:17 np0005539505 NetworkManager[55134]: <info>  [1764399437.8066] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Nov 29 01:57:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:17.819 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c9:0d 10.100.0.5'], port_security=['fa:16:3e:61:c9:0d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d501671-0357-4018-80f9-31d43293d107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7810e1a5-baa0-4375-99a8-632ac4dab559', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4dfd4ca1-34cd-46b3-9509-5d99ceaee255) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 01:57:17 np0005539505 systemd-machined[153285]: New machine qemu-19-instance-0000002a.
Nov 29 01:57:17 np0005539505 systemd[1]: Started Virtual Machine qemu-19-instance-0000002a.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:17 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00162|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:57:17 np0005539505 systemd[219842]: Activating special unit Exit the Session...
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped target Main User Target.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped target Basic System.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped target Paths.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped target Sockets.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped target Timers.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:57:17 np0005539505 systemd[219842]: Closed D-Bus User Message Bus Socket.
Nov 29 01:57:17 np0005539505 systemd[219842]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:57:17 np0005539505 systemd[219842]: Removed slice User Application Slice.
Nov 29 01:57:17 np0005539505 systemd[219842]: Reached target Shutdown.
Nov 29 01:57:17 np0005539505 systemd[219842]: Finished Exit the Session.
Nov 29 01:57:17 np0005539505 systemd[219842]: Reached target Exit the Session.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:17 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00163|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 ovn-installed in OVS
Nov 29 01:57:17 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:17Z|00164|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 up in Southbound
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:17 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:57:17 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.919 186962 INFO nova.compute.manager [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Took 7.03 seconds to build instance.
Nov 29 01:57:17 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:57:17 np0005539505 podman[220204]: 2025-11-29 06:57:17.848506594 +0000 UTC m=+0.045055644 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:57:17 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:57:17 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:57:17 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:57:17 np0005539505 nova_compute[186958]: 2025-11-29 06:57:17.947 186962 DEBUG oslo_concurrency.lockutils [None req-d846b2bb-1129-4980-85b1-50ed2b65f45a b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.222 186962 DEBUG nova.compute.manager [req-95cf77be-2693-443f-85a8-45cb9d2dcc3f req-7859fb17-3571-4238-87c5-49edb7b06edb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.223 186962 DEBUG oslo_concurrency.lockutils [req-95cf77be-2693-443f-85a8-45cb9d2dcc3f req-7859fb17-3571-4238-87c5-49edb7b06edb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.223 186962 DEBUG oslo_concurrency.lockutils [req-95cf77be-2693-443f-85a8-45cb9d2dcc3f req-7859fb17-3571-4238-87c5-49edb7b06edb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.223 186962 DEBUG oslo_concurrency.lockutils [req-95cf77be-2693-443f-85a8-45cb9d2dcc3f req-7859fb17-3571-4238-87c5-49edb7b06edb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.223 186962 DEBUG nova.compute.manager [req-95cf77be-2693-443f-85a8-45cb9d2dcc3f req-7859fb17-3571-4238-87c5-49edb7b06edb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Processing event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.337 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399438.336567, 4d501671-0357-4018-80f9-31d43293d107 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.337 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Started (Lifecycle Event)
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.340 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.349 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.352 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance spawned successfully.
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.352 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.357 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.361 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.364 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.377 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.377 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.378 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.378 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.378 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.378 186962 DEBUG nova.virt.libvirt.driver [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.390 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.390 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399438.336985, 4d501671-0357-4018-80f9-31d43293d107 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.390 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Paused (Lifecycle Event)
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.430 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.434 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399438.3441067, 4d501671-0357-4018-80f9-31d43293d107 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.435 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Resumed (Lifecycle Event)
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.464 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.469 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.491 186962 DEBUG nova.network.neutron [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Updated VIF entry in instance network info cache for port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.491 186962 DEBUG nova.network.neutron [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Updating instance_info_cache with network_info: [{"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.504 186962 INFO nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Took 6.01 seconds to spawn the instance on the hypervisor.
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.505 186962 DEBUG nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.516 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.518 186962 DEBUG oslo_concurrency.lockutils [req-9be730be-8c94-474a-b7bf-9c51ab3fe16c req-114d8848-1e0c-4bb3-86d2-30aef752fc64 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-4d501671-0357-4018-80f9-31d43293d107" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.602 186962 INFO nova.compute.manager [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Took 6.57 seconds to build instance.
Nov 29 01:57:18 np0005539505 nova_compute[186958]: 2025-11-29 06:57:18.623 186962 DEBUG oslo_concurrency.lockutils [None req-0bac1733-fb02-4b18-81e3-2f8091b2d54e 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 01:57:19 np0005539505 podman[220204]: 2025-11-29 06:57:19.670990729 +0000 UTC m=+1.867539779 container create 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.764 186962 DEBUG nova.compute.manager [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.764 186962 DEBUG oslo_concurrency.lockutils [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.764 186962 DEBUG oslo_concurrency.lockutils [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.765 186962 DEBUG oslo_concurrency.lockutils [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.765 186962 DEBUG nova.compute.manager [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.765 186962 WARNING nova.compute.manager [req-b2c93f90-0017-4e37-a158-947bf5518d54 req-43248477-9425-44cb-acfc-3873607643a2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.829 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.829 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.830 186962 INFO nova.compute.manager [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Rebooting instance#033[00m
Nov 29 01:57:19 np0005539505 systemd[1]: Started libpod-conmon-18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639.scope.
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.858 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.858 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquired lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:19 np0005539505 nova_compute[186958]: 2025-11-29 06:57:19.858 186962 DEBUG nova.network.neutron [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:19 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:57:19 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1ae359c14af6e705de99d3e263ba9d01277a310d74070dc4878944098ee1c974/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:57:19 np0005539505 podman[220204]: 2025-11-29 06:57:19.988545329 +0000 UTC m=+2.185094409 container init 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 01:57:19 np0005539505 podman[220204]: 2025-11-29 06:57:19.996898744 +0000 UTC m=+2.193447784 container start 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 01:57:20 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [NOTICE]   (220239) : New worker (220241) forked
Nov 29 01:57:20 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [NOTICE]   (220239) : Loading success.
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.300 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.302 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db691b6b-17b7-42a9-9fd2-162233da0513#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.319 186962 DEBUG oslo_concurrency.lockutils [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.320 186962 DEBUG oslo_concurrency.lockutils [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.321 186962 DEBUG nova.compute.manager [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.325 186962 DEBUG nova.compute.manager [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.325 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.325 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.325 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.326 186962 DEBUG nova.compute.manager [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.326 186962 WARNING nova.compute.manager [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state active and task_state powering-off.#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.326 186962 DEBUG nova.compute.manager [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.326 186962 DEBUG nova.compute.manager [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing instance network info cache due to event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.326 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.327 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3d197eae-e871-4973-917d-f63f1f604262]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.330 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb691b6b-11 in ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.332 186962 DEBUG nova.compute.manager [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.333 186962 DEBUG nova.objects.instance [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'flavor' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.337 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb691b6b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.337 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e9760527-0dee-4cf2-adab-5dca6e67fedd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.339 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[58b12f44-cb47-40a0-933f-5a374520cbf2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.359 186962 DEBUG nova.objects.instance [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'info_cache' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.361 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8a5baabe-f70d-4a0b-ad87-b30563a2576d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.382 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a35d5b1a-a3b3-4898-94de-6ed523d4fa35]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.407 186962 DEBUG nova.virt.libvirt.driver [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.440 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e68e368b-84bc-4092-bee8-227fb7a60859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 NetworkManager[55134]: <info>  [1764399440.4482] manager: (tapdb691b6b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.446 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[41a0f599-da0a-4346-a962-18b6b4bbd666]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.492 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6a31beb5-7e5d-4f65-a48d-b1c7189ad323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 systemd-udevd[220257]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.497 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1c76a7ae-c240-4a05-8f87-cb23fa4d1147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 NetworkManager[55134]: <info>  [1764399440.5376] device (tapdb691b6b-10): carrier: link connected
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.545 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c939fc98-243c-442c-98bc-a56f1b95f43e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.565 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[df3cfc56-e71b-4a26-82a3-1df3b85760d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488812, 'reachable_time': 37413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220276, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.585 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cba2cec9-f2dc-4336-b640-b60ea7f0e068]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:ad90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488812, 'tstamp': 488812}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220277, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.601 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b23c55-5a2f-498b-89e9-20c0b0fe1f06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488812, 'reachable_time': 37413, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220278, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.645 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b203bb-0ed6-43fb-b322-8beac60d78c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.717 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff291393-6533-4c9e-8dc0-ea65876f0ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.720 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.720 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.721 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb691b6b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:20 np0005539505 kernel: tapdb691b6b-10: entered promiscuous mode
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.722 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539505 NetworkManager[55134]: <info>  [1764399440.7235] manager: (tapdb691b6b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.728 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.733 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb691b6b-10, col_values=(('external_ids', {'iface-id': '4035feb9-29a5-4ae9-8490-a44f1379821c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.735 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.736 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:20Z|00165|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.752 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.753 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc30c1de-a941-4454-89e6-2284e6a41edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:20 np0005539505 nova_compute[186958]: 2025-11-29 06:57:20.755 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.757 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:57:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:20.759 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'env', 'PROCESS_TAG=haproxy-db691b6b-17b7-42a9-9fd2-162233da0513', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db691b6b-17b7-42a9-9fd2-162233da0513.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.091 186962 DEBUG nova.network.neutron [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.117 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Releasing lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.120 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.120 186962 DEBUG nova.network.neutron [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.163 186962 DEBUG nova.compute.manager [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539505 podman[220311]: 2025-11-29 06:57:21.181902752 +0000 UTC m=+0.034880115 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:57:21 np0005539505 kernel: tap6c054d6e-43 (unregistering): left promiscuous mode
Nov 29 01:57:21 np0005539505 NetworkManager[55134]: <info>  [1764399441.3461] device (tap6c054d6e-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:21Z|00166|binding|INFO|Releasing lport 6c054d6e-43d7-4885-81ce-27d38953d930 from this chassis (sb_readonly=0)
Nov 29 01:57:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:21Z|00167|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 down in Southbound
Nov 29 01:57:21 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:21Z|00168|binding|INFO|Removing iface tap6c054d6e-43 ovn-installed in OVS
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.370 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:21.372 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:70:bf 10.100.0.6'], port_security=['fa:16:3e:2d:70:bf 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2d8568dc-c82e-43a5-a9f1-46434e7873a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0d44aa5e-0377-42f3-8452-9dc4fcc1fcc2 1fb16d9c-331a-41c7-a6da-9b1479dbf50c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6c054d6e-43d7-4885-81ce-27d38953d930) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:21 np0005539505 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000029.scope: Deactivated successfully.
Nov 29 01:57:21 np0005539505 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000029.scope: Consumed 3.872s CPU time.
Nov 29 01:57:21 np0005539505 systemd-machined[153285]: Machine qemu-18-instance-00000029 terminated.
Nov 29 01:57:21 np0005539505 podman[220311]: 2025-11-29 06:57:21.513266387 +0000 UTC m=+0.366243730 container create 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:57:21 np0005539505 systemd[1]: Started libpod-conmon-4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e.scope.
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.595 186962 INFO nova.virt.libvirt.driver [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance destroyed successfully.#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.596 186962 DEBUG nova.objects.instance [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'resources' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:21 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:57:21 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cf97fbe655a30fdb638b1b1b815079426660c9939fa7cdbb6711b394ce7ab81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.622 186962 DEBUG nova.virt.libvirt.vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:57:21Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.622 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.623 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.624 186962 DEBUG os_vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.627 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c054d6e-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.629 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.635 186962 INFO os_vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43')#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.642 186962 DEBUG nova.virt.libvirt.driver [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start _get_guest_xml network_info=[{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.647 186962 WARNING nova.virt.libvirt.driver [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.657 186962 DEBUG nova.virt.libvirt.host [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.659 186962 DEBUG nova.virt.libvirt.host [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.663 186962 DEBUG nova.virt.libvirt.host [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.663 186962 DEBUG nova.virt.libvirt.host [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.664 186962 DEBUG nova.virt.libvirt.driver [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.665 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.665 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.665 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.665 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.666 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.666 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.666 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.666 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.667 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.667 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.668 186962 DEBUG nova.virt.hardware [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.668 186962 DEBUG nova.objects.instance [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.688 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.761 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.763 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.763 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.764 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.765 186962 DEBUG nova.virt.libvirt.vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:57:21Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.765 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.766 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.768 186962 DEBUG nova.objects.instance [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.784 186962 DEBUG nova.virt.libvirt.driver [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <uuid>2d8568dc-c82e-43a5-a9f1-46434e7873a1</uuid>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <name>instance-00000029</name>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:name>tempest-SecurityGroupsTestJSON-server-2017235252</nova:name>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:21</nova:creationTime>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:user uuid="b509e6a04cd147779a714856e3cd95ab">tempest-SecurityGroupsTestJSON-1623643057-project-member</nova:user>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:project uuid="32234968781646cf869d42134e62b91c">tempest-SecurityGroupsTestJSON-1623643057</nova:project>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        <nova:port uuid="6c054d6e-43d7-4885-81ce-27d38953d930">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="serial">2d8568dc-c82e-43a5-a9f1-46434e7873a1</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="uuid">2d8568dc-c82e-43a5-a9f1-46434e7873a1</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk.config"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:2d:70:bf"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <target dev="tap6c054d6e-43"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/console.log" append="off"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:21 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:21 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:21 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:21 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.785 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:21 np0005539505 podman[220311]: 2025-11-29 06:57:21.809335325 +0000 UTC m=+0.662312698 container init 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 01:57:21 np0005539505 podman[220311]: 2025-11-29 06:57:21.81769987 +0000 UTC m=+0.670677213 container start 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 01:57:21 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [NOTICE]   (220352) : New worker (220354) forked
Nov 29 01:57:21 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [NOTICE]   (220352) : Loading success.
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.854 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.856 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.926 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.928 186962 DEBUG nova.objects.instance [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:21 np0005539505 nova_compute[186958]: 2025-11-29 06:57:21.960 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:21.962 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6c054d6e-43d7-4885-81ce-27d38953d930 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db unbound from our chassis#033[00m
Nov 29 01:57:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:21.964 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09d1caf9-4b04-433c-8535-2cd6d44437db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:57:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:21.966 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd98d25a-8348-42e7-8a2d-4c124c6a3bf4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:21.967 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace which is not needed anymore#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.026 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.028 186962 DEBUG nova.virt.disk.api [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Checking if we can resize image /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.028 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.084 186962 DEBUG oslo_concurrency.processutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.085 186962 DEBUG nova.virt.disk.api [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Cannot resize image /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.085 186962 DEBUG nova.objects.instance [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'migration_context' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.108 186962 DEBUG nova.virt.libvirt.vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:21Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.109 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.109 186962 DEBUG nova.network.os_vif_util [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.110 186962 DEBUG os_vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.110 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.111 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.112 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.115 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.115 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c054d6e-43, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.115 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c054d6e-43, col_values=(('external_ids', {'iface-id': '6c054d6e-43d7-4885-81ce-27d38953d930', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:70:bf', 'vm-uuid': '2d8568dc-c82e-43a5-a9f1-46434e7873a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.117 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.1180] manager: (tap6c054d6e-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.119 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.122 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.123 186962 INFO os_vif [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43')
Nov 29 01:57:22 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [NOTICE]   (220239) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:22 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [NOTICE]   (220239) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:22 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [WARNING]  (220239) : Exiting Master process...
Nov 29 01:57:22 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [ALERT]    (220239) : Current worker (220241) exited with code 143 (Terminated)
Nov 29 01:57:22 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220235]: [WARNING]  (220239) : All workers exited. Exiting... (0)
Nov 29 01:57:22 np0005539505 systemd[1]: libpod-18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639.scope: Deactivated successfully.
Nov 29 01:57:22 np0005539505 podman[220386]: 2025-11-29 06:57:22.264084661 +0000 UTC m=+0.205261485 container died 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:22 np0005539505 kernel: tap6c054d6e-43: entered promiscuous mode
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.3107] manager: (tap6c054d6e-43): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Nov 29 01:57:22 np0005539505 systemd-udevd[220267]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:22Z|00169|binding|INFO|Claiming lport 6c054d6e-43d7-4885-81ce-27d38953d930 for this chassis.
Nov 29 01:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:22Z|00170|binding|INFO|6c054d6e-43d7-4885-81ce-27d38953d930: Claiming fa:16:3e:2d:70:bf 10.100.0.6
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.316 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.325 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:70:bf 10.100.0.6'], port_security=['fa:16:3e:2d:70:bf 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2d8568dc-c82e-43a5-a9f1-46434e7873a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0d44aa5e-0377-42f3-8452-9dc4fcc1fcc2 1fb16d9c-331a-41c7-a6da-9b1479dbf50c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6c054d6e-43d7-4885-81ce-27d38953d930) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.3279] device (tap6c054d6e-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.3295] device (tap6c054d6e-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:22Z|00171|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 ovn-installed in OVS
Nov 29 01:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:22Z|00172|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 up in Southbound
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay-1ae359c14af6e705de99d3e263ba9d01277a310d74070dc4878944098ee1c974-merged.mount: Deactivated successfully.
Nov 29 01:57:22 np0005539505 systemd-machined[153285]: New machine qemu-20-instance-00000029.
Nov 29 01:57:22 np0005539505 systemd[1]: Started Virtual Machine qemu-20-instance-00000029.
Nov 29 01:57:22 np0005539505 podman[220386]: 2025-11-29 06:57:22.40884711 +0000 UTC m=+0.350023904 container cleanup 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 01:57:22 np0005539505 systemd[1]: libpod-conmon-18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639.scope: Deactivated successfully.
Nov 29 01:57:22 np0005539505 podman[220442]: 2025-11-29 06:57:22.545063948 +0000 UTC m=+0.100657866 container remove 18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.552 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76deefc3-192a-4f2a-8384-f515439206fc]: (4, ('Sat Nov 29 06:57:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639)\n18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639\nSat Nov 29 06:57:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639)\n18787d98086c13dd007d057d46cf27a761d1deeaf625f4c97e34c017b1031639\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.555 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d75139c-5fe8-4647-9922-27ec29f06d1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.557 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.559 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 kernel: tap09d1caf9-40: left promiscuous mode
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.573 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.576 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d07d8b05-da68-4991-970b-2237bb9d84f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.595 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9ba33de6-01f0-4640-b769-4540e7ca0468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.599 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe99368-e17c-4ed2-bbd0-9a5291da032a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.601 186962 DEBUG nova.network.neutron [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updated VIF entry in instance network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.601 186962 DEBUG nova.network.neutron [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.620 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[359413a0-1fd6-4273-86c6-e6015f63de97]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488469, 'reachable_time': 28076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220456, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:22 np0005539505 systemd[1]: run-netns-ovnmeta\x2d09d1caf9\x2d4b04\x2d433c\x2d8535\x2d2cd6d44437db.mount: Deactivated successfully.
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.627 186962 DEBUG oslo_concurrency.lockutils [req-29816aff-1b02-413e-9fac-8e7ded17f4aa req-eb54bea5-eec6-47f3-8126-7a76d86a3396 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.631 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.632 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0369ab-3282-403c-9300-b7c48b07356a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.633 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6c054d6e-43d7-4885-81ce-27d38953d930 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db unbound from our chassis
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.635 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.645 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd51866-80f1-4122-a124-1697c7527514]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.649 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap09d1caf9-41 in ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.651 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap09d1caf9-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.651 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03c1649b-e473-4853-9fd6-7c6bdce7b27d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.652 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0c19aac9-84a7-4dfb-99c0-2e64489b614e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.667 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4c409f9e-dafa-4f4d-8add-d35468ecba25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.699 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27928bc8-4b9c-4419-96cc-cdab7677c902]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.721 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 2d8568dc-c82e-43a5-a9f1-46434e7873a1 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.723 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399442.721078, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.723 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Resumed (Lifecycle Event)
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.727 186962 DEBUG nova.compute.manager [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.733 186962 INFO nova.virt.libvirt.driver [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance rebooted successfully.
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.734 186962 DEBUG nova.compute.manager [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.736 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[40433ded-9a7a-4f5e-8fb6-0026126d241a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.744 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8a66b94e-1ffb-4564-9ad8-ddb463df68a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.7465] manager: (tap09d1caf9-40): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.747 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.753 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.779 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dbde2153-d90c-4209-b5f8-0c8733383f00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.784 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.785 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399442.7262979, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.785 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Started (Lifecycle Event)
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.785 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[08861cb5-e64c-4b49-90bb-da256c2b54ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.808 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.813 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 01:57:22 np0005539505 NetworkManager[55134]: <info>  [1764399442.8199] device (tap09d1caf9-40): carrier: link connected
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.825 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a4f255-85a6-4fc2-a013-2a92679e445c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:22 np0005539505 nova_compute[186958]: 2025-11-29 06:57:22.839 186962 DEBUG oslo_concurrency.lockutils [None req-8baa2d02-ad40-429b-9cc4-524dea199cd1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 3.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.845 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2719fdb1-43e4-4a73-9b53-dc6748c61ac1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489040, 'reachable_time': 36750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220474, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.868 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[666b1fb0-dd34-4e7d-a733-b1261bd72efa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea3:d9b0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 489040, 'tstamp': 489040}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220475, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.891 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a0035ff5-0f43-4ef1-9185-215f493c78f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap09d1caf9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a3:d9:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489040, 'reachable_time': 36750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220476, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:22.931 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d372a547-2feb-47cf-9062-2c525a2052b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.012 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[73172d16-2f77-48fe-bebb-35f265197b23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.013 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.014 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.014 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap09d1caf9-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:23 np0005539505 NetworkManager[55134]: <info>  [1764399443.0169] manager: (tap09d1caf9-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.016 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:23 np0005539505 kernel: tap09d1caf9-40: entered promiscuous mode
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.019 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap09d1caf9-40, col_values=(('external_ids', {'iface-id': '4919a746-8c41-4e1b-93e2-17dfe2a5b063'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:23 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:23Z|00173|binding|INFO|Releasing lport 4919a746-8c41-4e1b-93e2-17dfe2a5b063 from this chassis (sb_readonly=0)
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.035 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.037 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea07eed-6782-42d8-93fc-26f2abd89059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.037 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/09d1caf9-4b04-433c-8535-2cd6d44437db.pid.haproxy
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 09d1caf9-4b04-433c-8535-2cd6d44437db
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:57:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:23.038 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'env', 'PROCESS_TAG=haproxy-09d1caf9-4b04-433c-8535-2cd6d44437db', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/09d1caf9-4b04-433c-8535-2cd6d44437db.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.176 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399428.1757023, 373bf1d0-aa63-4995-87c5-d6a01e995a40 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.177 186962 INFO nova.compute.manager [-] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.392 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:23 np0005539505 nova_compute[186958]: 2025-11-29 06:57:23.450 186962 DEBUG nova.compute.manager [None req-214b4b81-680a-4554-9943-ede5d5e98092 - - - - - -] [instance: 373bf1d0-aa63-4995-87c5-d6a01e995a40] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:23 np0005539505 podman[220508]: 2025-11-29 06:57:23.49067524 +0000 UTC m=+0.062316230 container create 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:57:23 np0005539505 systemd[1]: Started libpod-conmon-0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545.scope.
Nov 29 01:57:23 np0005539505 podman[220508]: 2025-11-29 06:57:23.458618849 +0000 UTC m=+0.030259859 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:57:23 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:57:23 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a975e923e8870c1dc664c59d055794f65a1fd7e9bdca14049368ad4869bb044/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:57:23 np0005539505 podman[220508]: 2025-11-29 06:57:23.598244557 +0000 UTC m=+0.169885547 container init 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:23 np0005539505 podman[220508]: 2025-11-29 06:57:23.604980795 +0000 UTC m=+0.176621785 container start 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:57:23 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [NOTICE]   (220527) : New worker (220529) forked
Nov 29 01:57:23 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [NOTICE]   (220527) : Loading success.
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.547 186962 DEBUG nova.compute.manager [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.548 186962 DEBUG oslo_concurrency.lockutils [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.549 186962 DEBUG oslo_concurrency.lockutils [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.549 186962 DEBUG oslo_concurrency.lockutils [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.550 186962 DEBUG nova.compute.manager [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:26 np0005539505 nova_compute[186958]: 2025-11-29 06:57:26.550 186962 WARNING nova.compute.manager [req-1107fd7c-ecce-4d88-b06e-b4291fdc6bbd req-28eb3392-d9ae-4f65-aba2-6de9b2bae779 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:57:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:26.934 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:26.935 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:26.935 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.117 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.141 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.456 186962 DEBUG nova.compute.manager [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.456 186962 DEBUG nova.compute.manager [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing instance network info cache due to event network-changed-6c054d6e-43d7-4885-81ce-27d38953d930. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.456 186962 DEBUG oslo_concurrency.lockutils [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.456 186962 DEBUG oslo_concurrency.lockutils [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:27 np0005539505 nova_compute[186958]: 2025-11-29 06:57:27.457 186962 DEBUG nova.network.neutron [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Refreshing network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:27 np0005539505 podman[220539]: 2025-11-29 06:57:27.743804362 +0000 UTC m=+0.067146662 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:57:27 np0005539505 podman[220538]: 2025-11-29 06:57:27.745963565 +0000 UTC m=+0.066774201 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.161 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.162 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.162 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.162 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.163 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.173 186962 INFO nova.compute.manager [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Terminating instance#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.185 186962 DEBUG nova.compute.manager [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:28 np0005539505 kernel: tap6c054d6e-43 (unregistering): left promiscuous mode
Nov 29 01:57:28 np0005539505 NetworkManager[55134]: <info>  [1764399448.2058] device (tap6c054d6e-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:28 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:28Z|00174|binding|INFO|Releasing lport 6c054d6e-43d7-4885-81ce-27d38953d930 from this chassis (sb_readonly=0)
Nov 29 01:57:28 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:28Z|00175|binding|INFO|Setting lport 6c054d6e-43d7-4885-81ce-27d38953d930 down in Southbound
Nov 29 01:57:28 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:28Z|00176|binding|INFO|Removing iface tap6c054d6e-43 ovn-installed in OVS
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.217 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.219 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:28.229 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:70:bf 10.100.0.6'], port_security=['fa:16:3e:2d:70:bf 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2d8568dc-c82e-43a5-a9f1-46434e7873a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09d1caf9-4b04-433c-8535-2cd6d44437db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32234968781646cf869d42134e62b91c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0d44aa5e-0377-42f3-8452-9dc4fcc1fcc2 1fb16d9c-331a-41c7-a6da-9b1479dbf50c 2d542f68-03f4-4e14-8b62-0a8778b9e944', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46d714e3-e817-4d0b-99e9-4cc2314001af, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6c054d6e-43d7-4885-81ce-27d38953d930) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:28.231 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6c054d6e-43d7-4885-81ce-27d38953d930 in datapath 09d1caf9-4b04-433c-8535-2cd6d44437db unbound from our chassis#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.231 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:28.233 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09d1caf9-4b04-433c-8535-2cd6d44437db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:57:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:28.234 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd9ff87-d55d-40b2-be21-a4b27543b859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:28.235 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db namespace which is not needed anymore#033[00m
Nov 29 01:57:28 np0005539505 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000029.scope: Deactivated successfully.
Nov 29 01:57:28 np0005539505 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000029.scope: Consumed 6.020s CPU time.
Nov 29 01:57:28 np0005539505 systemd-machined[153285]: Machine qemu-20-instance-00000029 terminated.
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.415 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.423 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.461 186962 INFO nova.virt.libvirt.driver [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Instance destroyed successfully.#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.462 186962 DEBUG nova.objects.instance [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lazy-loading 'resources' on Instance uuid 2d8568dc-c82e-43a5-a9f1-46434e7873a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.483 186962 DEBUG nova.virt.libvirt.vif [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2017235252',display_name='tempest-SecurityGroupsTestJSON-server-2017235252',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2017235252',id=41,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32234968781646cf869d42134e62b91c',ramdisk_id='',reservation_id='r-0qmctx9u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1623643057',owner_user_name='tempest-SecurityGroupsTestJSON-1623643057-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:57:22Z,user_data=None,user_id='b509e6a04cd147779a714856e3cd95ab',uuid=2d8568dc-c82e-43a5-a9f1-46434e7873a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.484 186962 DEBUG nova.network.os_vif_util [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converting VIF {"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.485 186962 DEBUG nova.network.os_vif_util [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.485 186962 DEBUG os_vif [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.487 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.487 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c054d6e-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.495 186962 INFO os_vif [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2d:70:bf,bridge_name='br-int',has_traffic_filtering=True,id=6c054d6e-43d7-4885-81ce-27d38953d930,network=Network(09d1caf9-4b04-433c-8535-2cd6d44437db),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c054d6e-43')#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.496 186962 INFO nova.virt.libvirt.driver [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Deleting instance files /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1_del#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.497 186962 INFO nova.virt.libvirt.driver [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Deletion of /var/lib/nova/instances/2d8568dc-c82e-43a5-a9f1-46434e7873a1_del complete#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.568 186962 INFO nova.compute.manager [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.569 186962 DEBUG oslo.service.loopingcall [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.570 186962 DEBUG nova.compute.manager [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.571 186962 DEBUG nova.network.neutron [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.643 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.644 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.644 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.644 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.645 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.645 186962 WARNING nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.645 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.645 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.646 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.646 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.646 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.647 186962 WARNING nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.647 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.647 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.648 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.648 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.648 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.648 186962 WARNING nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.649 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.649 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.649 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.650 186962 DEBUG oslo_concurrency.lockutils [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.650 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:28 np0005539505 nova_compute[186958]: 2025-11-29 06:57:28.650 186962 DEBUG nova.compute.manager [req-1b49dadc-fab9-4599-a008-352507384d59 req-2695b105-a11e-45da-b65a-931856be1456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-unplugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [NOTICE]   (220527) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [NOTICE]   (220527) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [WARNING]  (220527) : Exiting Master process...
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [WARNING]  (220527) : Exiting Master process...
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [ALERT]    (220527) : Current worker (220529) exited with code 143 (Terminated)
Nov 29 01:57:28 np0005539505 neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db[220523]: [WARNING]  (220527) : All workers exited. Exiting... (0)
Nov 29 01:57:28 np0005539505 systemd[1]: libpod-0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545.scope: Deactivated successfully.
Nov 29 01:57:28 np0005539505 podman[220601]: 2025-11-29 06:57:28.683300184 +0000 UTC m=+0.340748591 container died 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:57:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:29 np0005539505 systemd[1]: var-lib-containers-storage-overlay-7a975e923e8870c1dc664c59d055794f65a1fd7e9bdca14049368ad4869bb044-merged.mount: Deactivated successfully.
Nov 29 01:57:29 np0005539505 podman[220601]: 2025-11-29 06:57:29.263479691 +0000 UTC m=+0.920928098 container cleanup 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 01:57:29 np0005539505 systemd[1]: libpod-conmon-0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545.scope: Deactivated successfully.
Nov 29 01:57:29 np0005539505 podman[220643]: 2025-11-29 06:57:29.746682372 +0000 UTC m=+0.456143648 container remove 0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.755 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f5add2-8cb2-44a5-9cab-89a20b24ce12]: (4, ('Sat Nov 29 06:57:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545)\n0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545\nSat Nov 29 06:57:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db (0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545)\n0737cdd0272b665fce45697543eacf7e0b62fa3f377896222cb95ac5e8342545\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.758 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[20920d91-39f7-4523-9063-41002b48d86a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.760 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap09d1caf9-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:29 np0005539505 nova_compute[186958]: 2025-11-29 06:57:29.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:29 np0005539505 kernel: tap09d1caf9-40: left promiscuous mode
Nov 29 01:57:29 np0005539505 nova_compute[186958]: 2025-11-29 06:57:29.779 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.783 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff834659-82a9-4973-93cd-9be90967a95d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.796 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6a62ec-f1e1-432a-9db6-ed409a8e5304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.799 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22220728-243d-42a3-9fdb-008740c8d195]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.825 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d67acd-8aeb-4a32-b756-a7278164a516]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 489031, 'reachable_time': 18074, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220659, 'error': None, 'target': 'ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 systemd[1]: run-netns-ovnmeta\x2d09d1caf9\x2d4b04\x2d433c\x2d8535\x2d2cd6d44437db.mount: Deactivated successfully.
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.831 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-09d1caf9-4b04-433c-8535-2cd6d44437db deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:57:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:29.831 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b4122028-6349-4073-9439-54fcb035133c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:29 np0005539505 nova_compute[186958]: 2025-11-29 06:57:29.855 186962 DEBUG nova.network.neutron [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updated VIF entry in instance network info cache for port 6c054d6e-43d7-4885-81ce-27d38953d930. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:57:29 np0005539505 nova_compute[186958]: 2025-11-29 06:57:29.857 186962 DEBUG nova.network.neutron [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [{"id": "6c054d6e-43d7-4885-81ce-27d38953d930", "address": "fa:16:3e:2d:70:bf", "network": {"id": "09d1caf9-4b04-433c-8535-2cd6d44437db", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-548045966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32234968781646cf869d42134e62b91c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c054d6e-43", "ovs_interfaceid": "6c054d6e-43d7-4885-81ce-27d38953d930", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:29 np0005539505 nova_compute[186958]: 2025-11-29 06:57:29.879 186962 DEBUG oslo_concurrency.lockutils [req-e5f2f9bb-acb7-4713-addb-951d9aa7944d req-b112144f-45b0-400e-92aa-fd220e1185c8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2d8568dc-c82e-43a5-a9f1-46434e7873a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.003 186962 DEBUG nova.network.neutron [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.022 186962 INFO nova.compute.manager [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Took 1.45 seconds to deallocate network for instance.#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.081 186962 DEBUG nova.compute.manager [req-53f08112-10c5-4c3e-8823-91101d62e3fb req-5fd15372-2196-4be4-8c17-615e4c9ac196 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-deleted-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.131 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.132 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.266 186962 DEBUG nova.compute.provider_tree [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.302 186962 DEBUG nova.scheduler.client.report [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.336 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.370 186962 INFO nova.scheduler.client.report [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Deleted allocations for instance 2d8568dc-c82e-43a5-a9f1-46434e7873a1#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.465 186962 DEBUG oslo_concurrency.lockutils [None req-a03eced8-2ca8-4831-8032-48efd2d9b2f1 b509e6a04cd147779a714856e3cd95ab 32234968781646cf869d42134e62b91c - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.480 186962 DEBUG nova.virt.libvirt.driver [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.736 186962 DEBUG nova.compute.manager [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.737 186962 DEBUG oslo_concurrency.lockutils [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.737 186962 DEBUG oslo_concurrency.lockutils [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.737 186962 DEBUG oslo_concurrency.lockutils [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2d8568dc-c82e-43a5-a9f1-46434e7873a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.737 186962 DEBUG nova.compute.manager [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] No waiting events found dispatching network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:30 np0005539505 nova_compute[186958]: 2025-11-29 06:57:30.738 186962 WARNING nova.compute.manager [req-f0044cff-bbfc-401c-b280-840f3152d9a0 req-78df680a-9d03-4569-8cfc-eded924de6f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Received unexpected event network-vif-plugged-6c054d6e-43d7-4885-81ce-27d38953d930 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:57:31 np0005539505 podman[220675]: 2025-11-29 06:57:31.731567176 +0000 UTC m=+0.061498476 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 01:57:32 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 01:57:32 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 01:57:32 np0005539505 systemd-logind[794]: New session 36 of user nova.
Nov 29 01:57:32 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 01:57:32 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 01:57:32 np0005539505 systemd[220698]: Queued start job for default target Main User Target.
Nov 29 01:57:32 np0005539505 systemd[220698]: Created slice User Application Slice.
Nov 29 01:57:32 np0005539505 systemd[220698]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:57:32 np0005539505 systemd[220698]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 01:57:32 np0005539505 systemd[220698]: Reached target Paths.
Nov 29 01:57:32 np0005539505 systemd[220698]: Reached target Timers.
Nov 29 01:57:32 np0005539505 systemd[220698]: Starting D-Bus User Message Bus Socket...
Nov 29 01:57:32 np0005539505 systemd[220698]: Starting Create User's Volatile Files and Directories...
Nov 29 01:57:32 np0005539505 systemd[220698]: Listening on D-Bus User Message Bus Socket.
Nov 29 01:57:32 np0005539505 systemd[220698]: Reached target Sockets.
Nov 29 01:57:32 np0005539505 systemd[220698]: Finished Create User's Volatile Files and Directories.
Nov 29 01:57:32 np0005539505 systemd[220698]: Reached target Basic System.
Nov 29 01:57:32 np0005539505 systemd[220698]: Reached target Main User Target.
Nov 29 01:57:32 np0005539505 systemd[220698]: Startup finished in 151ms.
Nov 29 01:57:32 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 01:57:32 np0005539505 systemd[1]: Started Session 36 of User nova.
Nov 29 01:57:32 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:32Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:c9:0d 10.100.0.5
Nov 29 01:57:32 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:32Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:c9:0d 10.100.0.5
Nov 29 01:57:33 np0005539505 nova_compute[186958]: 2025-11-29 06:57:33.425 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:33 np0005539505 nova_compute[186958]: 2025-11-29 06:57:33.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:33 np0005539505 systemd[1]: session-36.scope: Deactivated successfully.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: Session 36 logged out. Waiting for processes to exit.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: Removed session 36.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: New session 38 of user nova.
Nov 29 01:57:33 np0005539505 systemd[1]: Started Session 38 of User nova.
Nov 29 01:57:33 np0005539505 nova_compute[186958]: 2025-11-29 06:57:33.765 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:33 np0005539505 nova_compute[186958]: 2025-11-29 06:57:33.766 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:33 np0005539505 nova_compute[186958]: 2025-11-29 06:57:33.787 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:57:33 np0005539505 systemd[1]: session-38.scope: Deactivated successfully.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: Session 38 logged out. Waiting for processes to exit.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: Removed session 38.
Nov 29 01:57:33 np0005539505 systemd-logind[794]: New session 39 of user nova.
Nov 29 01:57:33 np0005539505 systemd[1]: Started Session 39 of User nova.
Nov 29 01:57:34 np0005539505 systemd[1]: session-39.scope: Deactivated successfully.
Nov 29 01:57:34 np0005539505 systemd-logind[794]: Session 39 logged out. Waiting for processes to exit.
Nov 29 01:57:34 np0005539505 systemd-logind[794]: Removed session 39.
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.093 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.094 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.102 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.102 186962 INFO nova.compute.claims [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.294 186962 DEBUG nova.compute.provider_tree [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.310 186962 DEBUG nova.scheduler.client.report [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.336 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.337 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.399 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.399 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquired lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.400 186962 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.412 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.413 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.428 186962 INFO nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.446 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.574 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.575 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.576 186962 INFO nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Creating image(s)#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.576 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.576 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.577 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.591 186962 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.595 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.636 186962 DEBUG nova.policy [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '577626ce99e245058e48fb6c5268a00f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f890cbeccf24a3cb44c8120bb217100', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.658 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.659 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.659 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.671 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.740 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.741 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.914 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk 1073741824" returned: 0 in 0.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.915 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.916 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:34 np0005539505 nova_compute[186958]: 2025-11-29 06:57:34.982 186962 DEBUG nova.network.neutron [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.008 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Releasing lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.016 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.017 186962 DEBUG nova.virt.disk.api [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Checking if we can resize image /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.017 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.081 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.082 186962 DEBUG nova.virt.disk.api [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Cannot resize image /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.082 186962 DEBUG nova.objects.instance [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lazy-loading 'migration_context' on Instance uuid c8ab194e-e936-4110-aef3-1eb79dc427c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.109 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.110 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Ensure instance console log exists: /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.110 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.110 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.111 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.155 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.157 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.157 186962 INFO nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Creating image(s)#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.158 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.170 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.271 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.272 186962 DEBUG nova.virt.disk.api [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Checking if we can resize image /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.273 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.341 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.343 186962 DEBUG nova.virt.disk.api [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Cannot resize image /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.369 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.370 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Ensure instance console log exists: /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.370 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.370 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.371 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.372 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.377 186962 WARNING nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.384 186962 DEBUG nova.virt.libvirt.host [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.384 186962 DEBUG nova.virt.libvirt.host [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.388 186962 DEBUG nova.virt.libvirt.host [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.389 186962 DEBUG nova.virt.libvirt.host [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.391 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.391 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.392 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.392 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.393 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.393 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.393 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.393 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.394 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.394 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.394 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.395 186962 DEBUG nova.virt.hardware [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.395 186962 DEBUG nova.objects.instance [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.423 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.509 186962 DEBUG oslo_concurrency.processutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.510 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Acquiring lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.510 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.511 186962 DEBUG oslo_concurrency.lockutils [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] Lock "/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.514 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <uuid>72856fd1-9e86-48df-817f-42b206cc0bea</uuid>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <name>instance-00000028</name>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:name>tempest-MigrationsAdminTest-server-2041941217</nova:name>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:35</nova:creationTime>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:user uuid="53ee944c04484336b9b14d84235a62b8">tempest-MigrationsAdminTest-1601255173-project-member</nova:user>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:        <nova:project uuid="890f94a625b342fdb17128922403c925">tempest-MigrationsAdminTest-1601255173</nova:project>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="serial">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="uuid">72856fd1-9e86-48df-817f-42b206cc0bea</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/disk.config"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/console.log" append="off"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:35 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:35 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:35 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:35 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.621 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Successfully created port: b9da574e-6a19-48ee-acaa-5d9843059704 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.719 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.720 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:35 np0005539505 nova_compute[186958]: 2025-11-29 06:57:35.721 186962 INFO nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Using config drive#033[00m
Nov 29 01:57:35 np0005539505 systemd-machined[153285]: New machine qemu-21-instance-00000028.
Nov 29 01:57:35 np0005539505 systemd[1]: Started Virtual Machine qemu-21-instance-00000028.
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.389 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Successfully updated port: b9da574e-6a19-48ee-acaa-5d9843059704 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.412 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.413 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquired lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.413 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.522 186962 DEBUG nova.compute.manager [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-changed-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.523 186962 DEBUG nova.compute.manager [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Refreshing instance network info cache due to event network-changed-b9da574e-6a19-48ee-acaa-5d9843059704. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.523 186962 DEBUG oslo_concurrency.lockutils [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.769 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.906 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399456.9056933, 72856fd1-9e86-48df-817f-42b206cc0bea => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.907 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.910 186962 DEBUG nova.compute.manager [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.916 186962 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance running successfully.#033[00m
Nov 29 01:57:36 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.921 186962 DEBUG nova.virt.libvirt.guest [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 01:57:36 np0005539505 nova_compute[186958]: 2025-11-29 06:57:36.921 186962 DEBUG nova.virt.libvirt.driver [None req-50caedd9-829f-4444-89e9-6331685a0d0e 9e259d5e4d874fb6b41a43111be58b1e f4500190058d46d38cce53b732755b38 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.174 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.177 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.309 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.310 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399456.9070785, 72856fd1-9e86-48df-817f-42b206cc0bea => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.310 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Started (Lifecycle Event)#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.365 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.369 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.401 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 01:57:37 np0005539505 podman[220774]: 2025-11-29 06:57:37.78183341 +0000 UTC m=+0.101459818 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.879 186962 DEBUG nova.network.neutron [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updating instance_info_cache with network_info: [{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.898 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Releasing lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.898 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Instance network_info: |[{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.899 186962 DEBUG oslo_concurrency.lockutils [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.899 186962 DEBUG nova.network.neutron [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Refreshing network info cache for port b9da574e-6a19-48ee-acaa-5d9843059704 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.904 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Start _get_guest_xml network_info=[{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.912 186962 WARNING nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.922 186962 DEBUG nova.virt.libvirt.host [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.923 186962 DEBUG nova.virt.libvirt.host [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.928 186962 DEBUG nova.virt.libvirt.host [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.930 186962 DEBUG nova.virt.libvirt.host [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.932 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.932 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.933 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.933 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.933 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.933 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.934 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.934 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.934 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.934 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.935 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.935 186962 DEBUG nova.virt.hardware [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.940 186962 DEBUG nova.virt.libvirt.vif [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=43,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbmlwCnrXwgDi+kZQ82iiBrJiPqnqamk6+oVjwA2kGpXrLDsSctqAK0rWLIJYdi1rqr+9xPvmLi4KCpU3OYdjbgezY4yyAIb7mcEAWaeeb0FPPXaiNiyIN2epr0LUYUrQ==',key_name='tempest-keypair-2021864762',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f890cbeccf24a3cb44c8120bb217100',ramdisk_id='',reservation_id='r-ai19nv1a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-444178242',owner_user_name='tempest-ServersV294TestFqdnHostnames-444178242-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='577626ce99e245058e48fb6c5268a00f',uuid=c8ab194e-e936-4110-aef3-1eb79dc427c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.941 186962 DEBUG nova.network.os_vif_util [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converting VIF {"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.942 186962 DEBUG nova.network.os_vif_util [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.947 186962 DEBUG nova.objects.instance [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lazy-loading 'pci_devices' on Instance uuid c8ab194e-e936-4110-aef3-1eb79dc427c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.973 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <uuid>c8ab194e-e936-4110-aef3-1eb79dc427c6</uuid>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <name>instance-0000002b</name>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:name>guest-instance-1</nova:name>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:37</nova:creationTime>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:user uuid="577626ce99e245058e48fb6c5268a00f">tempest-ServersV294TestFqdnHostnames-444178242-project-member</nova:user>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:project uuid="8f890cbeccf24a3cb44c8120bb217100">tempest-ServersV294TestFqdnHostnames-444178242</nova:project>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        <nova:port uuid="b9da574e-6a19-48ee-acaa-5d9843059704">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="serial">c8ab194e-e936-4110-aef3-1eb79dc427c6</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="uuid">c8ab194e-e936-4110-aef3-1eb79dc427c6</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.config"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:1a:28:01"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <target dev="tapb9da574e-6a"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/console.log" append="off"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:37 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:37 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:37 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:37 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.975 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Preparing to wait for external event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.975 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.976 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.976 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.977 186962 DEBUG nova.virt.libvirt.vif [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=43,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbmlwCnrXwgDi+kZQ82iiBrJiPqnqamk6+oVjwA2kGpXrLDsSctqAK0rWLIJYdi1rqr+9xPvmLi4KCpU3OYdjbgezY4yyAIb7mcEAWaeeb0FPPXaiNiyIN2epr0LUYUrQ==',key_name='tempest-keypair-2021864762',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f890cbeccf24a3cb44c8120bb217100',ramdisk_id='',reservation_id='r-ai19nv1a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-444178242',owner_user_name='tempest-ServersV294TestFqdnHostnames-444178242-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='577626ce99e245058e48fb6c5268a00f',uuid=c8ab194e-e936-4110-aef3-1eb79dc427c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.977 186962 DEBUG nova.network.os_vif_util [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converting VIF {"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.978 186962 DEBUG nova.network.os_vif_util [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.979 186962 DEBUG os_vif [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.981 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.982 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.986 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb9da574e-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.987 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb9da574e-6a, col_values=(('external_ids', {'iface-id': 'b9da574e-6a19-48ee-acaa-5d9843059704', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1a:28:01', 'vm-uuid': 'c8ab194e-e936-4110-aef3-1eb79dc427c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.988 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:37 np0005539505 NetworkManager[55134]: <info>  [1764399457.9900] manager: (tapb9da574e-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.992 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.997 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:37 np0005539505 nova_compute[186958]: 2025-11-29 06:57:37.999 186962 INFO os_vif [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a')#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.055 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.056 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.056 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] No VIF found with MAC fa:16:3e:1a:28:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.056 186962 INFO nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Using config drive#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.344 186962 INFO nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Creating config drive at /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.config#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.359 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgha0ygb0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.429 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.522 186962 DEBUG oslo_concurrency.processutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgha0ygb0" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:38 np0005539505 kernel: tapb9da574e-6a: entered promiscuous mode
Nov 29 01:57:38 np0005539505 NetworkManager[55134]: <info>  [1764399458.6014] manager: (tapb9da574e-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Nov 29 01:57:38 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:38Z|00177|binding|INFO|Claiming lport b9da574e-6a19-48ee-acaa-5d9843059704 for this chassis.
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:38 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:38Z|00178|binding|INFO|b9da574e-6a19-48ee-acaa-5d9843059704: Claiming fa:16:3e:1a:28:01 10.100.0.12
Nov 29 01:57:38 np0005539505 systemd-udevd[220771]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:38 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:38Z|00179|binding|INFO|Setting lport b9da574e-6a19-48ee-acaa-5d9843059704 ovn-installed in OVS
Nov 29 01:57:38 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:38Z|00180|binding|INFO|Setting lport b9da574e-6a19-48ee-acaa-5d9843059704 up in Southbound
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.614 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:28:01 10.100.0.12'], port_security=['fa:16:3e:1a:28:01 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c8ab194e-e936-4110-aef3-1eb79dc427c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f890cbeccf24a3cb44c8120bb217100', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9b06df29-3b9c-4f14-a3f5-5cccbc07d13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dfccf9c-9f98-4928-bddf-0a33fb548540, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b9da574e-6a19-48ee-acaa-5d9843059704) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.617 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b9da574e-6a19-48ee-acaa-5d9843059704 in datapath 161dc148-b0f6-438a-a30b-2e3f075bfa98 bound to our chassis#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:38 np0005539505 nova_compute[186958]: 2025-11-29 06:57:38.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.623 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 161dc148-b0f6-438a-a30b-2e3f075bfa98#033[00m
Nov 29 01:57:38 np0005539505 NetworkManager[55134]: <info>  [1764399458.6280] device (tapb9da574e-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:38 np0005539505 NetworkManager[55134]: <info>  [1764399458.6312] device (tapb9da574e-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.641 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4aab5a-91c8-405b-899d-0200254c0940]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.643 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap161dc148-b1 in ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.647 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap161dc148-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82e78d42-2f02-4ecf-9830-576a7d030976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 systemd-machined[153285]: New machine qemu-22-instance-0000002b.
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.649 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae374b09-9e92-43d8-ac5a-e23ec4a55df4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.663 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[29ca2f94-911a-4381-8e31-a6f88550375a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 systemd[1]: Started Virtual Machine qemu-22-instance-0000002b.
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.691 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4cab2561-f12e-42d6-a6b3-b1dd2102e81d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.741 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b93d1ea1-8e6a-4ee1-9932-6d8a33a01cb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.747 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f373e900-f50b-4135-8e5b-24f5ccef8e81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 NetworkManager[55134]: <info>  [1764399458.7489] manager: (tap161dc148-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/97)
Nov 29 01:57:38 np0005539505 systemd-udevd[220822]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:38 np0005539505 podman[220810]: 2025-11-29 06:57:38.764458609 +0000 UTC m=+0.169697701 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.802 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[29637c94-ca9f-43f6-b3dd-4c9535b73208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.806 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[afc81c70-6603-4aa0-a2a1-2806baf87993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 NetworkManager[55134]: <info>  [1764399458.8341] device (tap161dc148-b0): carrier: link connected
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.843 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d6ab6e-d600-44ab-b6a4-e603f7e25e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d15d02f2-7574-45af-8f0b-db33dcdf21eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap161dc148-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:65:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490641, 'reachable_time': 44374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220875, 'error': None, 'target': 'ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.887 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd05574-228b-448d-856a-357ff9e1a94a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:6552'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 490641, 'tstamp': 490641}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220878, 'error': None, 'target': 'ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.909 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fe299293-5239-4a22-a224-dd57ace2c71c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap161dc148-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:65:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490641, 'reachable_time': 44374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220883, 'error': None, 'target': 'ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:38.941 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[37c98b78-90f8-4609-8e21-4c363eab4947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.015 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399459.0145829, c8ab194e-e936-4110-aef3-1eb79dc427c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.016 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] VM Started (Lifecycle Event)#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.019 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a4050d82-1c67-451c-b690-1996f050756d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.022 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap161dc148-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.023 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.024 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap161dc148-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:39 np0005539505 NetworkManager[55134]: <info>  [1764399459.0267] manager: (tap161dc148-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Nov 29 01:57:39 np0005539505 kernel: tap161dc148-b0: entered promiscuous mode
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.025 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.029 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap161dc148-b0, col_values=(('external_ids', {'iface-id': 'c217269d-26c1-4873-a3c4-259e754d3f31'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:39 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:39Z|00181|binding|INFO|Releasing lport c217269d-26c1-4873-a3c4-259e754d3f31 from this chassis (sb_readonly=0)
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.045 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/161dc148-b0f6-438a-a30b-2e3f075bfa98.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/161dc148-b0f6-438a-a30b-2e3f075bfa98.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.046 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee1857d-5f63-48fb-9d25-70f2276b92f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.047 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-161dc148-b0f6-438a-a30b-2e3f075bfa98
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/161dc148-b0f6-438a-a30b-2e3f075bfa98.pid.haproxy
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 161dc148-b0f6-438a-a30b-2e3f075bfa98
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.048 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:39.050 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'env', 'PROCESS_TAG=haproxy-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/161dc148-b0f6-438a-a30b-2e3f075bfa98.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.053 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399459.014736, c8ab194e-e936-4110-aef3-1eb79dc427c6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.053 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.078 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.083 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.104 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:57:39 np0005539505 podman[220916]: 2025-11-29 06:57:39.465279807 +0000 UTC m=+0.037136001 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.598 186962 DEBUG nova.network.neutron [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updated VIF entry in instance network info cache for port b9da574e-6a19-48ee-acaa-5d9843059704. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.599 186962 DEBUG nova.network.neutron [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updating instance_info_cache with network_info: [{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.627 186962 DEBUG oslo_concurrency.lockutils [req-24321b1b-9e6c-4dc5-a3a4-3bc7c69bad6a req-e760dbf9-9f9f-471b-bd3a-e395d9d67c3a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.734 186962 DEBUG nova.compute.manager [req-bd9278c1-d5d5-4050-bce8-20e04be16876 req-dd6b27a3-f89d-45ab-a80e-04c7628364ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.735 186962 DEBUG oslo_concurrency.lockutils [req-bd9278c1-d5d5-4050-bce8-20e04be16876 req-dd6b27a3-f89d-45ab-a80e-04c7628364ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.735 186962 DEBUG oslo_concurrency.lockutils [req-bd9278c1-d5d5-4050-bce8-20e04be16876 req-dd6b27a3-f89d-45ab-a80e-04c7628364ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.736 186962 DEBUG oslo_concurrency.lockutils [req-bd9278c1-d5d5-4050-bce8-20e04be16876 req-dd6b27a3-f89d-45ab-a80e-04c7628364ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.736 186962 DEBUG nova.compute.manager [req-bd9278c1-d5d5-4050-bce8-20e04be16876 req-dd6b27a3-f89d-45ab-a80e-04c7628364ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Processing event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.738 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.744 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399459.7435095, c8ab194e-e936-4110-aef3-1eb79dc427c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.744 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.748 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.754 186962 INFO nova.virt.libvirt.driver [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Instance spawned successfully.#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.754 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.764 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.768 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.784 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.785 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.786 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.786 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.787 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.788 186962 DEBUG nova.virt.libvirt.driver [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.795 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.867 186962 INFO nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Took 5.29 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.868 186962 DEBUG nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.957 186962 INFO nova.compute.manager [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Took 5.91 seconds to build instance.#033[00m
Nov 29 01:57:39 np0005539505 nova_compute[186958]: 2025-11-29 06:57:39.971 186962 DEBUG oslo_concurrency.lockutils [None req-3dd6f4b7-e568-4a61-aeac-4ca4fa1de8e3 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:40 np0005539505 nova_compute[186958]: 2025-11-29 06:57:40.405 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:40 np0005539505 nova_compute[186958]: 2025-11-29 06:57:40.405 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquired lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:40 np0005539505 nova_compute[186958]: 2025-11-29 06:57:40.406 186962 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:57:40 np0005539505 nova_compute[186958]: 2025-11-29 06:57:40.577 186962 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:57:41 np0005539505 podman[220916]: 2025-11-29 06:57:41.453145087 +0000 UTC m=+2.025001241 container create a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.573 186962 DEBUG nova.virt.libvirt.driver [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.583 186962 DEBUG nova.network.neutron [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.610 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Releasing lock "refresh_cache-72856fd1-9e86-48df-817f-42b206cc0bea" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.625 186962 DEBUG nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Creating tmpfile /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea/tmp945raiel to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 29 01:57:41 np0005539505 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Deactivated successfully.
Nov 29 01:57:41 np0005539505 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000028.scope: Consumed 5.402s CPU time.
Nov 29 01:57:41 np0005539505 systemd-machined[153285]: Machine qemu-21-instance-00000028 terminated.
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.908 186962 INFO nova.virt.libvirt.driver [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Instance destroyed successfully.#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.909 186962 DEBUG nova.objects.instance [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'resources' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.922 186962 INFO nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Deleting instance files /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_del#033[00m
Nov 29 01:57:41 np0005539505 nova_compute[186958]: 2025-11-29 06:57:41.933 186962 INFO nova.virt.libvirt.driver [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Deletion of /var/lib/nova/instances/72856fd1-9e86-48df-817f-42b206cc0bea_del complete#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.015 186962 DEBUG nova.compute.manager [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.016 186962 DEBUG oslo_concurrency.lockutils [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.016 186962 DEBUG oslo_concurrency.lockutils [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.016 186962 DEBUG oslo_concurrency.lockutils [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.016 186962 DEBUG nova.compute.manager [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] No waiting events found dispatching network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.016 186962 WARNING nova.compute.manager [req-0498d9d2-b22b-49ed-9336-2da7dc4c9e5e req-27c70bd4-6b21-4a91-99e8-0fda5ebee1f0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received unexpected event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 for instance with vm_state active and task_state None.#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.027 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.028 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.043 186962 DEBUG nova.objects.instance [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lazy-loading 'migration_context' on Instance uuid 72856fd1-9e86-48df-817f-42b206cc0bea obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:42 np0005539505 systemd[1]: Started libpod-conmon-a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95.scope.
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.159 186962 DEBUG nova.compute.provider_tree [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:42 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.182 186962 DEBUG nova.scheduler.client.report [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:42 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b931443e4214ee6daf8e8b60fcfbb7b7478ae0af533b172f6f205f10c5bc9ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.241 186962 DEBUG oslo_concurrency.lockutils [None req-e162e088-2a2a-4d18-a32d-01322f357325 53ee944c04484336b9b14d84235a62b8 890f94a625b342fdb17128922403c925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:42 np0005539505 nova_compute[186958]: 2025-11-29 06:57:42.991 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:43 np0005539505 podman[220916]: 2025-11-29 06:57:43.053295959 +0000 UTC m=+3.625152213 container init a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 01:57:43 np0005539505 podman[220916]: 2025-11-29 06:57:43.060361476 +0000 UTC m=+3.632217630 container start a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:57:43 np0005539505 nova_compute[186958]: 2025-11-29 06:57:43.433 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:43 np0005539505 nova_compute[186958]: 2025-11-29 06:57:43.461 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399448.459422, 2d8568dc-c82e-43a5-a9f1-46434e7873a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:43 np0005539505 nova_compute[186958]: 2025-11-29 06:57:43.462 186962 INFO nova.compute.manager [-] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:57:43 np0005539505 nova_compute[186958]: 2025-11-29 06:57:43.736 186962 DEBUG nova.compute.manager [None req-15672d32-a34a-4b51-b6ca-e41200305ab7 - - - - - -] [instance: 2d8568dc-c82e-43a5-a9f1-46434e7873a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:44 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [NOTICE]   (220957) : New worker (220959) forked
Nov 29 01:57:44 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [NOTICE]   (220957) : Loading success.
Nov 29 01:57:44 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 01:57:44 np0005539505 systemd[220698]: Activating special unit Exit the Session...
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped target Main User Target.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped target Basic System.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped target Paths.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped target Sockets.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped target Timers.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 01:57:44 np0005539505 systemd[220698]: Closed D-Bus User Message Bus Socket.
Nov 29 01:57:44 np0005539505 systemd[220698]: Stopped Create User's Volatile Files and Directories.
Nov 29 01:57:44 np0005539505 systemd[220698]: Removed slice User Application Slice.
Nov 29 01:57:44 np0005539505 systemd[220698]: Reached target Shutdown.
Nov 29 01:57:44 np0005539505 systemd[220698]: Finished Exit the Session.
Nov 29 01:57:44 np0005539505 systemd[220698]: Reached target Exit the Session.
Nov 29 01:57:44 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 01:57:44 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 01:57:44 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 01:57:44 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 01:57:44 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 01:57:44 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 01:57:44 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 01:57:44 np0005539505 nova_compute[186958]: 2025-11-29 06:57:44.626 186962 DEBUG nova.compute.manager [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-changed-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:44 np0005539505 nova_compute[186958]: 2025-11-29 06:57:44.627 186962 DEBUG nova.compute.manager [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Refreshing instance network info cache due to event network-changed-b9da574e-6a19-48ee-acaa-5d9843059704. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:57:44 np0005539505 nova_compute[186958]: 2025-11-29 06:57:44.628 186962 DEBUG oslo_concurrency.lockutils [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:44 np0005539505 nova_compute[186958]: 2025-11-29 06:57:44.628 186962 DEBUG oslo_concurrency.lockutils [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:44 np0005539505 nova_compute[186958]: 2025-11-29 06:57:44.628 186962 DEBUG nova.network.neutron [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Refreshing network info cache for port b9da574e-6a19-48ee-acaa-5d9843059704 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:57:44 np0005539505 podman[220946]: 2025-11-29 06:57:44.633259187 +0000 UTC m=+0.947656853 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:45 np0005539505 nova_compute[186958]: 2025-11-29 06:57:45.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.033 186962 DEBUG nova.network.neutron [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updated VIF entry in instance network info cache for port b9da574e-6a19-48ee-acaa-5d9843059704. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.034 186962 DEBUG nova.network.neutron [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updating instance_info_cache with network_info: [{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.063 186962 DEBUG oslo_concurrency.lockutils [req-42fdf83f-76d3-4a0a-8d44-d6e766824681 req-ae151f30-53b1-471f-83fa-b37096cb9338 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:46 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:46Z|00182|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:57:46 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:46Z|00183|binding|INFO|Releasing lport c217269d-26c1-4873-a3c4-259e754d3f31 from this chassis (sb_readonly=0)
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539505 podman[220978]: 2025-11-29 06:57:46.744427986 +0000 UTC m=+0.069404237 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:57:46 np0005539505 kernel: tap4dfd4ca1-34 (unregistering): left promiscuous mode
Nov 29 01:57:46 np0005539505 NetworkManager[55134]: <info>  [1764399466.8341] device (tap4dfd4ca1-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:46 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:46Z|00184|binding|INFO|Releasing lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 from this chassis (sb_readonly=0)
Nov 29 01:57:46 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:46Z|00185|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 down in Southbound
Nov 29 01:57:46 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:46Z|00186|binding|INFO|Removing iface tap4dfd4ca1-34 ovn-installed in OVS
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.849 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.854 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539505 nova_compute[186958]: 2025-11-29 06:57:46.876 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:46.879 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c9:0d 10.100.0.5'], port_security=['fa:16:3e:61:c9:0d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d501671-0357-4018-80f9-31d43293d107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7810e1a5-baa0-4375-99a8-632ac4dab559', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4dfd4ca1-34cd-46b3-9509-5d99ceaee255) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:46.882 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:57:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:46.884 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db691b6b-17b7-42a9-9fd2-162233da0513, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:57:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:46.886 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1c48bb-bb2e-4226-8f3a-1cf70cd43e7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:46.886 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace which is not needed anymore#033[00m
Nov 29 01:57:46 np0005539505 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Nov 29 01:57:46 np0005539505 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d0000002a.scope: Consumed 14.432s CPU time.
Nov 29 01:57:46 np0005539505 systemd-machined[153285]: Machine qemu-19-instance-0000002a terminated.
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.381 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.413 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.414 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.503 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:47 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [NOTICE]   (220352) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:47 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [NOTICE]   (220352) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:47 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [WARNING]  (220352) : Exiting Master process...
Nov 29 01:57:47 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [ALERT]    (220352) : Current worker (220354) exited with code 143 (Terminated)
Nov 29 01:57:47 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[220341]: [WARNING]  (220352) : All workers exited. Exiting... (0)
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.597 186962 DEBUG nova.compute.manager [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:47 np0005539505 systemd[1]: libpod-4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e.scope: Deactivated successfully.
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.598 186962 DEBUG oslo_concurrency.lockutils [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.602 186962 DEBUG oslo_concurrency.lockutils [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.602 186962 DEBUG oslo_concurrency.lockutils [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.603 186962 DEBUG nova.compute.manager [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.603 186962 WARNING nova.compute.manager [req-67942c87-b1dc-4e2f-a898-4e288d28a269 req-c77852ed-644b-4a06-b362-fb4650505850 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state active and task_state powering-off.#033[00m
Nov 29 01:57:47 np0005539505 podman[221021]: 2025-11-29 06:57:47.604146107 +0000 UTC m=+0.590679916 container died 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.614 186962 INFO nova.virt.libvirt.driver [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance shutdown successfully after 27 seconds.#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.624 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance destroyed successfully.#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.625 186962 DEBUG nova.objects.instance [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.647 186962 DEBUG nova.compute.manager [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.674 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.675 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.753 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.754 186962 DEBUG oslo_concurrency.lockutils [None req-4124901f-3594-4c65-b838-2c6fb5f19c24 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 27.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.763 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.838 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.840 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:47.844 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.914 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:47 np0005539505 nova_compute[186958]: 2025-11-29 06:57:47.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.094 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.097 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5562MB free_disk=73.23869323730469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.098 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.098 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.208 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 4d501671-0357-4018-80f9-31d43293d107 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.210 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance c8ab194e-e936-4110-aef3-1eb79dc427c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.210 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.210 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.283 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.301 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.328 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.330 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:48 np0005539505 nova_compute[186958]: 2025-11-29 06:57:48.436 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:48 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:48 np0005539505 systemd[1]: var-lib-containers-storage-overlay-3cf97fbe655a30fdb638b1b1b815079426660c9939fa7cdbb6711b394ce7ab81-merged.mount: Deactivated successfully.
Nov 29 01:57:49 np0005539505 podman[221021]: 2025-11-29 06:57:49.11163926 +0000 UTC m=+2.098173059 container cleanup 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:57:49 np0005539505 systemd[1]: libpod-conmon-4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e.scope: Deactivated successfully.
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.782 186962 DEBUG nova.compute.manager [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.783 186962 DEBUG oslo_concurrency.lockutils [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.783 186962 DEBUG oslo_concurrency.lockutils [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.784 186962 DEBUG oslo_concurrency.lockutils [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.784 186962 DEBUG nova.compute.manager [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:49 np0005539505 nova_compute[186958]: 2025-11-29 06:57:49.784 186962 WARNING nova.compute.manager [req-ab35ec84-0489-4689-b13d-71fe40d1e200 req-caf4fbf0-e68e-4478-b908-ffa50f9add8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 01:57:50 np0005539505 nova_compute[186958]: 2025-11-29 06:57:50.508 186962 INFO nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Rebuilding instance#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.051 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.115 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.127 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.146 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'resources' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.159 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.172 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.177 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance already shutdown.#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.182 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance destroyed successfully.#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.188 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance destroyed successfully.#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.189 186962 DEBUG nova.virt.libvirt.vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-me
mber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:49Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.189 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.190 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.191 186962 DEBUG os_vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.194 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4dfd4ca1-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.195 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.204 186962 INFO os_vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34')#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.205 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Deleting instance files /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107_del#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.205 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Deletion of /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107_del complete#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.378 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.378 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating image(s)#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.379 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.379 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.380 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.380 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.380 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:51 np0005539505 podman[221082]: 2025-11-29 06:57:51.935181736 +0000 UTC m=+2.779838164 container remove 4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:51.948 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f9ec66-72c9-4355-a49f-a5bd5b186ec4]: (4, ('Sat Nov 29 06:57:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e)\n4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e\nSat Nov 29 06:57:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e)\n4a69dc28ee71adb5173dbbaeaf6ce41155f3bfc1fcf94b1d780844fbc211b49e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:51.955 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[70355b72-00a6-447e-8674-91297103b84b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:51.958 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:51 np0005539505 kernel: tapdb691b6b-10: left promiscuous mode
Nov 29 01:57:51 np0005539505 nova_compute[186958]: 2025-11-29 06:57:51.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:51.980 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aad22fbe-54de-4ce3-8e01-9413cc927a26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:51.999 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2042a75-44db-4553-a793-09272875cd52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:52.001 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd109d22-664c-42f4-948a-039568d39263]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:52.030 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aebee309-e447-4630-acdb-a6e7b4901b98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488801, 'reachable_time': 15064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221099, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:52.038 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:52.039 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[211de353-375f-4725-a244-712422fb6913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:52 np0005539505 systemd[1]: run-netns-ovnmeta\x2ddb691b6b\x2d17b7\x2d42a9\x2d9fd2\x2d162233da0513.mount: Deactivated successfully.
Nov 29 01:57:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:52.040 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:57:52 np0005539505 nova_compute[186958]: 2025-11-29 06:57:52.958 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:52 np0005539505 nova_compute[186958]: 2025-11-29 06:57:52.959 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:52 np0005539505 nova_compute[186958]: 2025-11-29 06:57:52.976 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:52 np0005539505 nova_compute[186958]: 2025-11-29 06:57:52.976 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.192 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.268 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.269 186962 DEBUG nova.virt.images [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] 3372b7b2-657b-4c4d-9d9d-7c5b771a630a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.645 186962 DEBUG nova.privsep.utils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.646 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.664 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.667 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.668 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.668 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.848 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.849 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.849 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 01:57:53 np0005539505 nova_compute[186958]: 2025-11-29 06:57:53.849 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c8ab194e-e936-4110-aef3-1eb79dc427c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:54 np0005539505 nova_compute[186958]: 2025-11-29 06:57:54.838 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.part /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted" returned: 0 in 1.192s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:54 np0005539505 nova_compute[186958]: 2025-11-29 06:57:54.843 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:54 np0005539505 nova_compute[186958]: 2025-11-29 06:57:54.927 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123.converted --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:54 np0005539505 nova_compute[186958]: 2025-11-29 06:57:54.929 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:54 np0005539505 nova_compute[186958]: 2025-11-29 06:57:54.944 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.035 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.036 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.037 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.059 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.137 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.138 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.167 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updating instance_info_cache with network_info: [{"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.207 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-c8ab194e-e936-4110-aef3-1eb79dc427c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.208 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.220 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk 1073741824" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.222 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.222 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.290 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.291 186962 DEBUG nova.virt.disk.api [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Checking if we can resize image /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.292 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.357 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.358 186962 DEBUG nova.virt.disk.api [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Cannot resize image /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.359 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.359 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Ensure instance console log exists: /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.359 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.360 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.360 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.362 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start _get_guest_xml network_info=[{"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.367 186962 WARNING nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.377 186962 DEBUG nova.virt.libvirt.host [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.378 186962 DEBUG nova.virt.libvirt.host [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.384 186962 DEBUG nova.virt.libvirt.host [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.385 186962 DEBUG nova.virt.libvirt.host [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.386 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.386 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.387 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.387 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.387 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.387 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.388 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.388 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.388 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.388 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.388 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.389 186962 DEBUG nova.virt.hardware [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.389 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.419 186962 DEBUG nova.virt.libvirt.vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempes
t-ServerActionsTestOtherA-229564135-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:51Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.420 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.421 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.424 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <uuid>4d501671-0357-4018-80f9-31d43293d107</uuid>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <name>instance-0000002a</name>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-1768107239</nova:name>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:57:55</nova:creationTime>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:user uuid="812d926ee4ed4159b2e88b7a69990423">tempest-ServerActionsTestOtherA-229564135-project-member</nova:user>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:project uuid="4362be0b90a64d63b2294bbc495486d3">tempest-ServerActionsTestOtherA-229564135</nova:project>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        <nova:port uuid="4dfd4ca1-34cd-46b3-9509-5d99ceaee255">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="serial">4d501671-0357-4018-80f9-31d43293d107</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="uuid">4d501671-0357-4018-80f9-31d43293d107</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:61:c9:0d"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <target dev="tap4dfd4ca1-34"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/console.log" append="off"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:57:55 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:57:55 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:57:55 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:57:55 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.425 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Preparing to wait for external event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.426 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.426 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.426 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.427 186962 DEBUG nova.virt.libvirt.vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempes
t-ServerActionsTestOtherA-229564135-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:57:51Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.428 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.428 186962 DEBUG nova.network.os_vif_util [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.429 186962 DEBUG os_vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.430 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.431 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.431 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.436 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.436 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4dfd4ca1-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.437 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4dfd4ca1-34, col_values=(('external_ids', {'iface-id': '4dfd4ca1-34cd-46b3-9509-5d99ceaee255', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:c9:0d', 'vm-uuid': '4d501671-0357-4018-80f9-31d43293d107'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.439 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539505 NetworkManager[55134]: <info>  [1764399475.4401] manager: (tap4dfd4ca1-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.440 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.449 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.450 186962 INFO os_vif [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34')#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.495 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.496 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.496 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No VIF found with MAC fa:16:3e:61:c9:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.497 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Using config drive#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.515 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.551 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'keypairs' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:55 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:55Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1a:28:01 10.100.0.12
Nov 29 01:57:55 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:55Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1a:28:01 10.100.0.12
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.857 186962 INFO nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Creating config drive at /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config#033[00m
Nov 29 01:57:55 np0005539505 nova_compute[186958]: 2025-11-29 06:57:55.864 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbjcm2z_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.000 186962 DEBUG oslo_concurrency.processutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcbjcm2z_" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:57:56 np0005539505 kernel: tap4dfd4ca1-34: entered promiscuous mode
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.0807] manager: (tap4dfd4ca1-34): new Tun device (/org/freedesktop/NetworkManager/Devices/100)
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00187|binding|INFO|Claiming lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for this chassis.
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00188|binding|INFO|4dfd4ca1-34cd-46b3-9509-5d99ceaee255: Claiming fa:16:3e:61:c9:0d 10.100.0.5
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.081 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.087 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c9:0d 10.100.0.5'], port_security=['fa:16:3e:61:c9:0d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d501671-0357-4018-80f9-31d43293d107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7810e1a5-baa0-4375-99a8-632ac4dab559', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4dfd4ca1-34cd-46b3-9509-5d99ceaee255) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.088 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 in datapath db691b6b-17b7-42a9-9fd2-162233da0513 bound to our chassis#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.090 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db691b6b-17b7-42a9-9fd2-162233da0513#033[00m
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00189|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 ovn-installed in OVS
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00190|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 up in Southbound
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.102 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.104 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[18dd1aa2-07fd-4109-8d30-13eab65eee4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.105 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb691b6b-11 in ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.107 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.108 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb691b6b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.108 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ad69ed-7cb5-4c75-98d9-46cf61770af3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.109 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[852cfc1c-56a7-43ee-860f-30962a9f4dc2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 systemd-udevd[221164]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.1280] device (tap4dfd4ca1-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.126 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[58cdd023-0ffe-44c9-992e-6f72db138612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.1291] device (tap4dfd4ca1-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:57:56 np0005539505 systemd-machined[153285]: New machine qemu-23-instance-0000002a.
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.145 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[72085b54-d261-458e-bf87-8bae941b0e68]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 systemd[1]: Started Virtual Machine qemu-23-instance-0000002a.
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.183 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad84f05-9051-4fbd-a2c5-0331e822276f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.189 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[33af80df-eaf8-4d5d-b369-8c623419bc2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.1911] manager: (tapdb691b6b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/101)
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.231 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[edbbed2a-f514-4769-8a4f-ddac019db50a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.237 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3b20970c-f6f1-4bff-a69b-5f50fd243106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.2684] device (tapdb691b6b-10): carrier: link connected
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.280 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5c870506-0155-45fe-97c2-4327b450f707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.301 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[09de3029-4d38-47b4-a961-7a942ba6fd3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492385, 'reachable_time': 33335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221198, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.320 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2518c4-8b1a-4d64-818e-ccd4fc662148]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:ad90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 492385, 'tstamp': 492385}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221199, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.343 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d2ce09-8438-4e21-8c04-7b7dbd58e38d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492385, 'reachable_time': 33335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221200, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.356 186962 DEBUG nova.compute.manager [req-9dab73cc-45c4-4994-ba50-0fa0e451b128 req-0de395ab-c5e6-4d01-be32-d69b42baeb68 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.357 186962 DEBUG oslo_concurrency.lockutils [req-9dab73cc-45c4-4994-ba50-0fa0e451b128 req-0de395ab-c5e6-4d01-be32-d69b42baeb68 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.357 186962 DEBUG oslo_concurrency.lockutils [req-9dab73cc-45c4-4994-ba50-0fa0e451b128 req-0de395ab-c5e6-4d01-be32-d69b42baeb68 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.358 186962 DEBUG oslo_concurrency.lockutils [req-9dab73cc-45c4-4994-ba50-0fa0e451b128 req-0de395ab-c5e6-4d01-be32-d69b42baeb68 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.358 186962 DEBUG nova.compute.manager [req-9dab73cc-45c4-4994-ba50-0fa0e451b128 req-0de395ab-c5e6-4d01-be32-d69b42baeb68 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Processing event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.385 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce0d9ff-879d-424e-8741-d66ffdb15fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.460 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6db097be-e002-4764-bbbf-0d61e28ee834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.462 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.462 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.463 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb691b6b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.465 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.4660] manager: (tapdb691b6b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Nov 29 01:57:56 np0005539505 kernel: tapdb691b6b-10: entered promiscuous mode
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.468 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.470 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb691b6b-10, col_values=(('external_ids', {'iface-id': '4035feb9-29a5-4ae9-8490-a44f1379821c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.471 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00191|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.484 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 4d501671-0357-4018-80f9-31d43293d107 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.484 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.485 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399476.482514, 4d501671-0357-4018-80f9-31d43293d107 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.485 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Started (Lifecycle Event)#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.486 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[63a8b39d-eaa6-4f76-b051-b3ec5e073cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.487 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.488 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.488 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'env', 'PROCESS_TAG=haproxy-db691b6b-17b7-42a9-9fd2-162233da0513', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db691b6b-17b7-42a9-9fd2-162233da0513.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.493 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.498 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance spawned successfully.#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.498 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.525 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.532 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.537 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.537 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.538 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.539 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.539 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.540 186962 DEBUG nova.virt.libvirt.driver [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.591 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.592 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399476.4827368, 4d501671-0357-4018-80f9-31d43293d107 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.592 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.637 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.645 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399476.492421, 4d501671-0357-4018-80f9-31d43293d107 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.645 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.677 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.681 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.684 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.712 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.764 186962 INFO nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] bringing vm to original state: 'stopped'#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.854 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.856 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.856 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.861 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 01:57:56 np0005539505 kernel: tap4dfd4ca1-34 (unregistering): left promiscuous mode
Nov 29 01:57:56 np0005539505 NetworkManager[55134]: <info>  [1764399476.9051] device (tap4dfd4ca1-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.910 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399461.9051435, 72856fd1-9e86-48df-817f-42b206cc0bea => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.912 186962 INFO nova.compute.manager [-] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00192|binding|INFO|Releasing lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 from this chassis (sb_readonly=0)
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00193|binding|INFO|Setting lport 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 down in Southbound
Nov 29 01:57:56 np0005539505 ovn_controller[95143]: 2025-11-29T06:57:56Z|00194|binding|INFO|Removing iface tap4dfd4ca1-34 ovn-installed in OVS
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:56.934 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:c9:0d 10.100.0.5'], port_security=['fa:16:3e:61:c9:0d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d501671-0357-4018-80f9-31d43293d107', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7810e1a5-baa0-4375-99a8-632ac4dab559', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4dfd4ca1-34cd-46b3-9509-5d99ceaee255) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:57:56 np0005539505 podman[221239]: 2025-11-29 06:57:56.936479065 +0000 UTC m=+0.065116312 container create 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:56 np0005539505 nova_compute[186958]: 2025-11-29 06:57:56.954 186962 DEBUG nova.compute.manager [None req-185e8fd2-0032-43e9-a744-26bd810f41a4 - - - - - -] [instance: 72856fd1-9e86-48df-817f-42b206cc0bea] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:56 np0005539505 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Nov 29 01:57:56 np0005539505 systemd[1]: Started libpod-conmon-762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b.scope.
Nov 29 01:57:56 np0005539505 systemd-machined[153285]: Machine qemu-23-instance-0000002a terminated.
Nov 29 01:57:56 np0005539505 podman[221239]: 2025-11-29 06:57:56.900050376 +0000 UTC m=+0.028687643 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:57:56 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:57:57 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02f7adaf502b082ce6f8288cb87814ce79c96cf8e9c495c108392b3d9fbab96c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:57:57 np0005539505 podman[221239]: 2025-11-29 06:57:57.113913713 +0000 UTC m=+0.242550990 container init 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:57:57 np0005539505 podman[221239]: 2025-11-29 06:57:57.125378759 +0000 UTC m=+0.254016006 container start 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [NOTICE]   (221267) : New worker (221278) forked
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [NOTICE]   (221267) : Loading success.
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.173 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance destroyed successfully.#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.174 186962 DEBUG nova.compute.manager [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.216 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4dfd4ca1-34cd-46b3-9509-5d99ceaee255 in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.218 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db691b6b-17b7-42a9-9fd2-162233da0513, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.219 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[107c9cde-22b6-4faa-8410-74252148e081]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.220 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace which is not needed anymore#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.290 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.325 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.325 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.326 186962 DEBUG nova.objects.instance [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [NOTICE]   (221267) : haproxy version is 2.8.14-c23fe91
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [NOTICE]   (221267) : path to executable is /usr/sbin/haproxy
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [WARNING]  (221267) : Exiting Master process...
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [WARNING]  (221267) : Exiting Master process...
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [ALERT]    (221267) : Current worker (221278) exited with code 143 (Terminated)
Nov 29 01:57:57 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221256]: [WARNING]  (221267) : All workers exited. Exiting... (0)
Nov 29 01:57:57 np0005539505 systemd[1]: libpod-762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b.scope: Deactivated successfully.
Nov 29 01:57:57 np0005539505 podman[221305]: 2025-11-29 06:57:57.373523942 +0000 UTC m=+0.051529924 container died 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.393 186962 DEBUG oslo_concurrency.lockutils [None req-8c40e585-7f3a-4f41-be51-1832155ff902 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:57 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b-userdata-shm.mount: Deactivated successfully.
Nov 29 01:57:57 np0005539505 systemd[1]: var-lib-containers-storage-overlay-02f7adaf502b082ce6f8288cb87814ce79c96cf8e9c495c108392b3d9fbab96c-merged.mount: Deactivated successfully.
Nov 29 01:57:57 np0005539505 podman[221305]: 2025-11-29 06:57:57.656741043 +0000 UTC m=+0.334747025 container cleanup 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 01:57:57 np0005539505 systemd[1]: libpod-conmon-762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b.scope: Deactivated successfully.
Nov 29 01:57:57 np0005539505 podman[221333]: 2025-11-29 06:57:57.727642514 +0000 UTC m=+0.048326650 container remove 762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c736324-08f3-4358-8183-c15cf8fd5f37]: (4, ('Sat Nov 29 06:57:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b)\n762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b\nSat Nov 29 06:57:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b)\n762a76fcb61089ae8452e7050a004022230af752f633ca88bbdc2d8cf892e69b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.739 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cc2a4633-4d6b-499e-9804-14f9e886a1ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.741 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.744 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:57 np0005539505 kernel: tapdb691b6b-10: left promiscuous mode
Nov 29 01:57:57 np0005539505 nova_compute[186958]: 2025-11-29 06:57:57.765 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.770 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bbfc634a-1940-4014-b067-552569e1309d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.789 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[266668b4-16c0-4021-b6ea-243f21f54d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.791 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6b9dd8e3-903d-452c-9c67-1970d0e9c5ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.813 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[630be361-b46e-4407-81cc-65830b68c423]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 492376, 'reachable_time': 35349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221353, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.819 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:57:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:57:57.819 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[9b0e0413-15dd-4fb4-835c-f237ea5c88c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:57:57 np0005539505 systemd[1]: run-netns-ovnmeta\x2ddb691b6b\x2d17b7\x2d42a9\x2d9fd2\x2d162233da0513.mount: Deactivated successfully.
Nov 29 01:57:57 np0005539505 podman[221354]: 2025-11-29 06:57:57.872989469 +0000 UTC m=+0.060296370 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:57:57 np0005539505 podman[221351]: 2025-11-29 06:57:57.881654724 +0000 UTC m=+0.071689595 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.442 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.483 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.483 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.483 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.484 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.484 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.484 186962 WARNING nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.485 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.485 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.485 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.486 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.486 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.486 186962 WARNING nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-unplugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.486 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.486 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.487 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.487 186962 DEBUG oslo_concurrency.lockutils [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.487 186962 DEBUG nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] No waiting events found dispatching network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:57:58 np0005539505 nova_compute[186958]: 2025-11-29 06:57:58.487 186962 WARNING nova.compute.manager [req-12696fe6-f365-4a65-b7b8-b931d846d66e req-5b57496c-d2c4-418f-88fd-3566c9dab58f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received unexpected event network-vif-plugged-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.051 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.052 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.052 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "4d501671-0357-4018-80f9-31d43293d107-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.052 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.053 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.068 186962 INFO nova.compute.manager [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Terminating instance#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.079 186962 DEBUG nova.compute.manager [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.087 186962 INFO nova.virt.libvirt.driver [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Instance destroyed successfully.#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.088 186962 DEBUG nova.objects.instance [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'resources' on Instance uuid 4d501671-0357-4018-80f9-31d43293d107 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.100 186962 DEBUG nova.virt.libvirt.vif [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T06:57:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1768107239',display_name='tempest-tempest.common.compute-instance-1768107239',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1768107239',id=42,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-gh9wve4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:57:57Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=4d501671-0357-4018-80f9-31d43293d107,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.100 186962 DEBUG nova.network.os_vif_util [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "address": "fa:16:3e:61:c9:0d", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4dfd4ca1-34", "ovs_interfaceid": "4dfd4ca1-34cd-46b3-9509-5d99ceaee255", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.101 186962 DEBUG nova.network.os_vif_util [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.101 186962 DEBUG os_vif [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.103 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.103 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4dfd4ca1-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.105 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.107 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.110 186962 INFO os_vif [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:c9:0d,bridge_name='br-int',has_traffic_filtering=True,id=4dfd4ca1-34cd-46b3-9509-5d99ceaee255,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4dfd4ca1-34')#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.110 186962 INFO nova.virt.libvirt.driver [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Deleting instance files /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107_del#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.111 186962 INFO nova.virt.libvirt.driver [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Deletion of /var/lib/nova/instances/4d501671-0357-4018-80f9-31d43293d107_del complete#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.187 186962 INFO nova.compute.manager [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Took 0.11 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.188 186962 DEBUG oslo.service.loopingcall [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.188 186962 DEBUG nova.compute.manager [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.188 186962 DEBUG nova.network.neutron [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:57:59 np0005539505 nova_compute[186958]: 2025-11-29 06:57:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:01.042 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.695 186962 DEBUG nova.network.neutron [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.714 186962 INFO nova.compute.manager [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] Took 2.53 seconds to deallocate network for instance.#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.794 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.795 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.879 186962 DEBUG nova.compute.manager [req-bfd17356-f1ba-46e4-a87f-8c2444f7b73e req-3f8c507e-b66a-47e3-86e4-67d839959496 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4d501671-0357-4018-80f9-31d43293d107] Received event network-vif-deleted-4dfd4ca1-34cd-46b3-9509-5d99ceaee255 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.925 186962 DEBUG nova.compute.provider_tree [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.953 186962 DEBUG nova.scheduler.client.report [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:01 np0005539505 nova_compute[186958]: 2025-11-29 06:58:01.985 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:02 np0005539505 nova_compute[186958]: 2025-11-29 06:58:02.012 186962 INFO nova.scheduler.client.report [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Deleted allocations for instance 4d501671-0357-4018-80f9-31d43293d107#033[00m
Nov 29 01:58:02 np0005539505 nova_compute[186958]: 2025-11-29 06:58:02.083 186962 DEBUG oslo_concurrency.lockutils [None req-572b4356-8a81-4b4c-a38a-bc280e1b80de 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "4d501671-0357-4018-80f9-31d43293d107" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.031s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:02 np0005539505 podman[221398]: 2025-11-29 06:58:02.745870841 +0000 UTC m=+0.069252443 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 01:58:03 np0005539505 nova_compute[186958]: 2025-11-29 06:58:03.445 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:04 np0005539505 nova_compute[186958]: 2025-11-29 06:58:04.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:08 np0005539505 nova_compute[186958]: 2025-11-29 06:58:08.448 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:08 np0005539505 podman[221418]: 2025-11-29 06:58:08.773965344 +0000 UTC m=+0.092615109 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:58:08 np0005539505 podman[221442]: 2025-11-29 06:58:08.990719686 +0000 UTC m=+0.157620507 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:58:09 np0005539505 nova_compute[186958]: 2025-11-29 06:58:09.156 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:12 np0005539505 nova_compute[186958]: 2025-11-29 06:58:12.170 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399477.1701028, 4d501671-0357-4018-80f9-31d43293d107 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:12 np0005539505 nova_compute[186958]: 2025-11-29 06:58:12.171 186962 INFO nova.compute.manager [-] [instance: 4d501671-0357-4018-80f9-31d43293d107] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:12 np0005539505 nova_compute[186958]: 2025-11-29 06:58:12.218 186962 DEBUG nova.compute.manager [None req-4d18307f-5b53-4c0f-9beb-e10389df4047 - - - - - -] [instance: 4d501671-0357-4018-80f9-31d43293d107] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:13 np0005539505 nova_compute[186958]: 2025-11-29 06:58:13.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.159 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.526 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.527 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.550 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.696 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.697 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.712 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.713 186962 INFO nova.compute.claims [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.856 186962 DEBUG nova.compute.provider_tree [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.878 186962 DEBUG nova.scheduler.client.report [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.907 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.908 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.977 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.977 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:58:14 np0005539505 nova_compute[186958]: 2025-11-29 06:58:14.995 186962 INFO nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.017 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.128 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.130 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.131 186962 INFO nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Creating image(s)#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.132 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.133 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.134 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.162 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.247 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.249 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.250 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.266 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.334 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.335 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:15 np0005539505 nova_compute[186958]: 2025-11-29 06:58:15.356 186962 DEBUG nova.policy [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '812d926ee4ed4159b2e88b7a69990423', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4362be0b90a64d63b2294bbc495486d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:58:15 np0005539505 podman[221477]: 2025-11-29 06:58:15.742723016 +0000 UTC m=+0.080633387 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.032 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk 1073741824" returned: 0 in 0.696s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.033 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.033 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.127 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.129 186962 DEBUG nova.virt.disk.api [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Checking if we can resize image /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.129 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.186 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.188 186962 DEBUG nova.virt.disk.api [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Cannot resize image /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.188 186962 DEBUG nova.objects.instance [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'migration_context' on Instance uuid b2ea8f54-d424-4c0e-8387-90f8f7f699a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.218 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.219 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Ensure instance console log exists: /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.219 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.220 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.220 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:16 np0005539505 nova_compute[186958]: 2025-11-29 06:58:16.621 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Successfully created port: 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:17.456 104201 DEBUG eventlet.wsgi.server [-] (104201) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:17.462 104201 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: Accept: */*#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: Connection: close#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: Content-Type: text/plain#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: Host: 169.254.169.254#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: User-Agent: curl/7.84.0#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: X-Forwarded-For: 10.100.0.12#015
Nov 29 01:58:17 np0005539505 ovn_metadata_agent[104089]: X-Ovn-Network-Id: 161dc148-b0f6-438a-a30b-2e3f075bfa98 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 29 01:58:17 np0005539505 podman[221504]: 2025-11-29 06:58:17.795883284 +0000 UTC m=+0.120160148 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.264 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Successfully updated port: 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.280 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.280 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquired lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.280 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.384 186962 DEBUG nova.compute.manager [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-changed-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.385 186962 DEBUG nova.compute.manager [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Refreshing instance network info cache due to event network-changed-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.385 186962 DEBUG oslo_concurrency.lockutils [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.452 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:58:18 np0005539505 nova_compute[186958]: 2025-11-29 06:58:18.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.495 104201 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.496 104201 INFO eventlet.wsgi.server [-] 10.100.0.12,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1673 time: 2.0347724#033[00m
Nov 29 01:58:19 np0005539505 haproxy-metadata-proxy-161dc148-b0f6-438a-a30b-2e3f075bfa98[220959]: 10.100.0.12:52212 [29/Nov/2025:06:58:17.454] listener listener/metadata 0/0/0/2041/2041 200 1657 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.766 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.767 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.768 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.769 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.769 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.785 186962 INFO nova.compute.manager [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Terminating instance#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.798 186962 DEBUG nova.compute.manager [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:58:19 np0005539505 kernel: tapb9da574e-6a (unregistering): left promiscuous mode
Nov 29 01:58:19 np0005539505 NetworkManager[55134]: <info>  [1764399499.8245] device (tapb9da574e-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:58:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:19Z|00195|binding|INFO|Releasing lport b9da574e-6a19-48ee-acaa-5d9843059704 from this chassis (sb_readonly=0)
Nov 29 01:58:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:19Z|00196|binding|INFO|Setting lport b9da574e-6a19-48ee-acaa-5d9843059704 down in Southbound
Nov 29 01:58:19 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:19Z|00197|binding|INFO|Removing iface tapb9da574e-6a ovn-installed in OVS
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.830 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.837 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:28:01 10.100.0.12'], port_security=['fa:16:3e:1a:28:01 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c8ab194e-e936-4110-aef3-1eb79dc427c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f890cbeccf24a3cb44c8120bb217100', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9b06df29-3b9c-4f14-a3f5-5cccbc07d13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1dfccf9c-9f98-4928-bddf-0a33fb548540, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b9da574e-6a19-48ee-acaa-5d9843059704) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.838 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b9da574e-6a19-48ee-acaa-5d9843059704 in datapath 161dc148-b0f6-438a-a30b-2e3f075bfa98 unbound from our chassis#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.840 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 161dc148-b0f6-438a-a30b-2e3f075bfa98, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.841 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[144edcb8-265d-4d81-bf25-ceedb142bfe7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:19.842 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98 namespace which is not needed anymore#033[00m
Nov 29 01:58:19 np0005539505 nova_compute[186958]: 2025-11-29 06:58:19.852 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:19 np0005539505 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Nov 29 01:58:19 np0005539505 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002b.scope: Consumed 15.094s CPU time.
Nov 29 01:58:19 np0005539505 systemd-machined[153285]: Machine qemu-22-instance-0000002b terminated.
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.079 186962 INFO nova.virt.libvirt.driver [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Instance destroyed successfully.#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.080 186962 DEBUG nova.objects.instance [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lazy-loading 'resources' on Instance uuid c8ab194e-e936-4110-aef3-1eb79dc427c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.091 186962 DEBUG nova.virt.libvirt.vif [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:57:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=43,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDbmlwCnrXwgDi+kZQ82iiBrJiPqnqamk6+oVjwA2kGpXrLDsSctqAK0rWLIJYdi1rqr+9xPvmLi4KCpU3OYdjbgezY4yyAIb7mcEAWaeeb0FPPXaiNiyIN2epr0LUYUrQ==',key_name='tempest-keypair-2021864762',keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:57:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f890cbeccf24a3cb44c8120bb217100',ramdisk_id='',reservation_id='r-ai19nv1a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-444178242',owner_user_name='tempest-ServersV294TestFqdnHostnames-444178242-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:57:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='577626ce99e245058e48fb6c5268a00f',uuid=c8ab194e-e936-4110-aef3-1eb79dc427c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.091 186962 DEBUG nova.network.os_vif_util [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converting VIF {"id": "b9da574e-6a19-48ee-acaa-5d9843059704", "address": "fa:16:3e:1a:28:01", "network": {"id": "161dc148-b0f6-438a-a30b-2e3f075bfa98", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-192356474-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f890cbeccf24a3cb44c8120bb217100", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb9da574e-6a", "ovs_interfaceid": "b9da574e-6a19-48ee-acaa-5d9843059704", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.092 186962 DEBUG nova.network.os_vif_util [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.093 186962 DEBUG os_vif [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.095 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.095 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb9da574e-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.097 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.099 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.102 186962 INFO os_vif [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1a:28:01,bridge_name='br-int',has_traffic_filtering=True,id=b9da574e-6a19-48ee-acaa-5d9843059704,network=Network(161dc148-b0f6-438a-a30b-2e3f075bfa98),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb9da574e-6a')#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.103 186962 INFO nova.virt.libvirt.driver [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Deleting instance files /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6_del#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.104 186962 INFO nova.virt.libvirt.driver [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Deletion of /var/lib/nova/instances/c8ab194e-e936-4110-aef3-1eb79dc427c6_del complete#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.196 186962 INFO nova.compute.manager [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.196 186962 DEBUG oslo.service.loopingcall [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.197 186962 DEBUG nova.compute.manager [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.197 186962 DEBUG nova.network.neutron [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:58:20 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [NOTICE]   (220957) : haproxy version is 2.8.14-c23fe91
Nov 29 01:58:20 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [NOTICE]   (220957) : path to executable is /usr/sbin/haproxy
Nov 29 01:58:20 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [WARNING]  (220957) : Exiting Master process...
Nov 29 01:58:20 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [ALERT]    (220957) : Current worker (220959) exited with code 143 (Terminated)
Nov 29 01:58:20 np0005539505 neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98[220942]: [WARNING]  (220957) : All workers exited. Exiting... (0)
Nov 29 01:58:20 np0005539505 systemd[1]: libpod-a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95.scope: Deactivated successfully.
Nov 29 01:58:20 np0005539505 podman[221548]: 2025-11-29 06:58:20.28073843 +0000 UTC m=+0.343610996 container died a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.599 186962 DEBUG nova.network.neutron [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updating instance_info_cache with network_info: [{"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.623 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Releasing lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.624 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Instance network_info: |[{"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.625 186962 DEBUG oslo_concurrency.lockutils [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.625 186962 DEBUG nova.network.neutron [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Refreshing network info cache for port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.628 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Start _get_guest_xml network_info=[{"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.633 186962 WARNING nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.644 186962 DEBUG nova.virt.libvirt.host [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.645 186962 DEBUG nova.virt.libvirt.host [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.649 186962 DEBUG nova.virt.libvirt.host [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.650 186962 DEBUG nova.virt.libvirt.host [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.651 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.652 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.652 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.652 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.652 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.653 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.653 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.653 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.653 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.654 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.654 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.654 186962 DEBUG nova.virt.hardware [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.658 186962 DEBUG nova.virt.libvirt.vif [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-325119097',display_name='tempest-ServerActionsTestOtherA-server-325119097',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-325119097',id=45,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-68lqq68v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:58:15Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=b2ea8f54-d424-4c0e-8387-90f8f7f699a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.658 186962 DEBUG nova.network.os_vif_util [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.659 186962 DEBUG nova.network.os_vif_util [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.660 186962 DEBUG nova.objects.instance [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b2ea8f54-d424-4c0e-8387-90f8f7f699a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.686 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <uuid>b2ea8f54-d424-4c0e-8387-90f8f7f699a0</uuid>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <name>instance-0000002d</name>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerActionsTestOtherA-server-325119097</nova:name>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:58:20</nova:creationTime>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:user uuid="812d926ee4ed4159b2e88b7a69990423">tempest-ServerActionsTestOtherA-229564135-project-member</nova:user>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:project uuid="4362be0b90a64d63b2294bbc495486d3">tempest-ServerActionsTestOtherA-229564135</nova:project>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        <nova:port uuid="8c82a0e1-8f6b-4fda-87ac-5bb3f267671c">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="serial">b2ea8f54-d424-4c0e-8387-90f8f7f699a0</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="uuid">b2ea8f54-d424-4c0e-8387-90f8f7f699a0</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.config"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:4c:e1:d2"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <target dev="tap8c82a0e1-8f"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/console.log" append="off"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:58:20 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:58:20 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:58:20 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:58:20 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.688 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Preparing to wait for external event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.688 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.688 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.688 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.689 186962 DEBUG nova.virt.libvirt.vif [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-325119097',display_name='tempest-ServerActionsTestOtherA-server-325119097',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-325119097',id=45,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-68lqq68v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:58:15Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=b2ea8f54-d424-4c0e-8387-90f8f7f699a0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.689 186962 DEBUG nova.network.os_vif_util [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.690 186962 DEBUG nova.network.os_vif_util [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.690 186962 DEBUG os_vif [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.690 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.691 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.691 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.694 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.694 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c82a0e1-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.695 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c82a0e1-8f, col_values=(('external_ids', {'iface-id': '8c82a0e1-8f6b-4fda-87ac-5bb3f267671c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:e1:d2', 'vm-uuid': 'b2ea8f54-d424-4c0e-8387-90f8f7f699a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.696 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 NetworkManager[55134]: <info>  [1764399500.6977] manager: (tap8c82a0e1-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.701 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.703 186962 INFO os_vif [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f')#033[00m
Nov 29 01:58:20 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95-userdata-shm.mount: Deactivated successfully.
Nov 29 01:58:20 np0005539505 systemd[1]: var-lib-containers-storage-overlay-8b931443e4214ee6daf8e8b60fcfbb7b7478ae0af533b172f6f205f10c5bc9ce-merged.mount: Deactivated successfully.
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.911 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.912 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.912 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] No VIF found with MAC fa:16:3e:4c:e1:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.912 186962 INFO nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Using config drive#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.998 186962 DEBUG nova.compute.manager [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-unplugged-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.999 186962 DEBUG oslo_concurrency.lockutils [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:20 np0005539505 nova_compute[186958]: 2025-11-29 06:58:20.999 186962 DEBUG oslo_concurrency.lockutils [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.000 186962 DEBUG oslo_concurrency.lockutils [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.000 186962 DEBUG nova.compute.manager [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] No waiting events found dispatching network-vif-unplugged-b9da574e-6a19-48ee-acaa-5d9843059704 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.000 186962 DEBUG nova.compute.manager [req-996cab9f-b8f7-4f3f-9695-29a608d2eeb9 req-fdee0e77-2a62-4542-88c9-63687a371775 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-unplugged-b9da574e-6a19-48ee-acaa-5d9843059704 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:58:21 np0005539505 podman[221548]: 2025-11-29 06:58:21.704585588 +0000 UTC m=+1.767458134 container cleanup a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 01:58:21 np0005539505 systemd[1]: libpod-conmon-a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95.scope: Deactivated successfully.
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.749 186962 INFO nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Creating config drive at /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.config#033[00m
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.756 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpje66uizr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:58:21 np0005539505 nova_compute[186958]: 2025-11-29 06:58:21.888 186962 DEBUG oslo_concurrency.processutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpje66uizr" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:58:21 np0005539505 kernel: tap8c82a0e1-8f: entered promiscuous mode
Nov 29 01:58:22 np0005539505 systemd-udevd[221529]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:58:22 np0005539505 NetworkManager[55134]: <info>  [1764399501.9529] manager: (tap8c82a0e1-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/104)
Nov 29 01:58:22 np0005539505 NetworkManager[55134]: <info>  [1764399502.0143] device (tap8c82a0e1-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:58:22 np0005539505 NetworkManager[55134]: <info>  [1764399502.0156] device (tap8c82a0e1-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:58:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:22Z|00198|binding|INFO|Claiming lport 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c for this chassis.
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:22Z|00199|binding|INFO|8c82a0e1-8f6b-4fda-87ac-5bb3f267671c: Claiming fa:16:3e:4c:e1:d2 10.100.0.5
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.024 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:e1:d2 10.100.0.5'], port_security=['fa:16:3e:4c:e1:d2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b2ea8f54-d424-4c0e-8387-90f8f7f699a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7810e1a5-baa0-4375-99a8-632ac4dab559', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:22Z|00200|binding|INFO|Setting lport 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c ovn-installed in OVS
Nov 29 01:58:22 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:22Z|00201|binding|INFO|Setting lport 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c up in Southbound
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.028 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539505 systemd-machined[153285]: New machine qemu-24-instance-0000002d.
Nov 29 01:58:22 np0005539505 systemd[1]: Started Virtual Machine qemu-24-instance-0000002d.
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.280 186962 DEBUG nova.network.neutron [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.298 186962 INFO nova.compute.manager [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Took 2.10 seconds to deallocate network for instance.#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.383 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.384 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.464 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399502.4639337, b2ea8f54-d424-4c0e-8387-90f8f7f699a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.464 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] VM Started (Lifecycle Event)#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.483 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.486 186962 DEBUG nova.compute.provider_tree [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.489 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399502.4647865, b2ea8f54-d424-4c0e-8387-90f8f7f699a0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.489 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.504 186962 DEBUG nova.scheduler.client.report [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.509 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.511 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.538 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.540 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.571 186962 INFO nova.scheduler.client.report [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Deleted allocations for instance c8ab194e-e936-4110-aef3-1eb79dc427c6#033[00m
Nov 29 01:58:22 np0005539505 podman[221597]: 2025-11-29 06:58:22.647658895 +0000 UTC m=+0.919822026 container remove a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.655 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d82433-1a95-4779-8283-20dea2441c8a]: (4, ('Sat Nov 29 06:58:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98 (a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95)\na44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95\nSat Nov 29 06:58:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98 (a44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95)\na44e6116a8b3fa518fa8f8fab6ded3746f7b0d8cd743f1cc0f31a69030664a95\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.658 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a16125-848f-423f-8502-92fbc5a2391d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.660 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap161dc148-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.662 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539505 kernel: tap161dc148-b0: left promiscuous mode
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.674 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.679 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9cff6b43-8a38-43a9-8819-62d8c6f4964f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.709 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[366d1b51-148b-44ec-8064-8d821842d582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.711 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ba29b6-5fbc-46fb-b2c8-9a9fb0a0127f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.734 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fc142d32-ccb5-41f6-85c8-32a677e90b42]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 490631, 'reachable_time': 41681, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221641, 'error': None, 'target': 'ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 systemd[1]: run-netns-ovnmeta\x2d161dc148\x2db0f6\x2d438a\x2da30b\x2d2e3f075bfa98.mount: Deactivated successfully.
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.738 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-161dc148-b0f6-438a-a30b-2e3f075bfa98 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.738 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3af20f65-320d-43af-b333-99399203f59e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.740 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.742 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network db691b6b-17b7-42a9-9fd2-162233da0513#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.754 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3c76083a-1409-42c6-a249-23aaced908e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.755 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdb691b6b-11 in ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.757 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdb691b6b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.757 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3e433918-4436-4cdb-8901-296cb9bd527a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.758 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4f434bce-3ca8-4dcf-868c-8ac47eaf32df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.773 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[de27a33a-a6c2-441d-8507-571bfa8e57e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.806 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[acba5e1e-ac95-4a28-ac43-439d7721ddfd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.849 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7497d4fb-6f5c-4dd2-8177-220582d6aeae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.856 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b309a1b7-67fd-45f2-9eb4-6301552b65d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 NetworkManager[55134]: <info>  [1764399502.8583] manager: (tapdb691b6b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/105)
Nov 29 01:58:22 np0005539505 systemd-udevd[221649]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.897 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d80f634e-2b01-49c5-85ab-60fc58fccc23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.901 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dac4e935-d003-431a-8e9d-b1b2dc929fb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 NetworkManager[55134]: <info>  [1764399502.9340] device (tapdb691b6b-10): carrier: link connected
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.937 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[66db0e6a-8f15-4b7e-becb-d13b4471f566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.958 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[daa58ce1-0a77-428e-af5f-dc19ce18c563]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495051, 'reachable_time': 28171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221668, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 nova_compute[186958]: 2025-11-29 06:58:22.967 186962 DEBUG oslo_concurrency.lockutils [None req-cb297090-8ab3-4514-a0a6-667ed72d1170 577626ce99e245058e48fb6c5268a00f 8f890cbeccf24a3cb44c8120bb217100 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.971 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed1d754-7021-4bc1-8330-66d2c03de371]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe61:ad90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495051, 'tstamp': 495051}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221669, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:22.985 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[380fca33-8dcb-4a48-aa3e-de550af80640]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdb691b6b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:61:ad:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495051, 'reachable_time': 28171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221670, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.011 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4143e12c-7050-4f83-aaba-490f9d12f350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.060 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2402ca36-08fb-4902-a0c5-c6ff864764de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.062 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.063 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.064 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb691b6b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.079 186962 DEBUG nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.079 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.080 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.080 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c8ab194e-e936-4110-aef3-1eb79dc427c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.080 186962 DEBUG nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] No waiting events found dispatching network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.080 186962 WARNING nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received unexpected event network-vif-plugged-b9da574e-6a19-48ee-acaa-5d9843059704 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.080 186962 DEBUG nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Received event network-vif-deleted-b9da574e-6a19-48ee-acaa-5d9843059704 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.081 186962 DEBUG nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.081 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.081 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.081 186962 DEBUG oslo_concurrency.lockutils [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.082 186962 DEBUG nova.compute.manager [req-63e86cd0-f379-4343-b307-c9860649c3f8 req-5de271b9-db89-4be1-b954-88a10ea71726 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Processing event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.082 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:58:23 np0005539505 NetworkManager[55134]: <info>  [1764399503.0994] manager: (tapdb691b6b-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.099 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:23 np0005539505 kernel: tapdb691b6b-10: entered promiscuous mode
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.101 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.102 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399503.1011417, b2ea8f54-d424-4c0e-8387-90f8f7f699a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.102 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.104 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdb691b6b-10, col_values=(('external_ids', {'iface-id': '4035feb9-29a5-4ae9-8490-a44f1379821c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.105 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:23 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:23Z|00202|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.109 186962 INFO nova.virt.libvirt.driver [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Instance spawned successfully.#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.110 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.123 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.128 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.132 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.132 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.133 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.134 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.134 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.135 186962 DEBUG nova.virt.libvirt.driver [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.135 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.136 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[da49f8e5-a034-407e-9a60-5d12dad24c62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.137 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/db691b6b-17b7-42a9-9fd2-162233da0513.pid.haproxy
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID db691b6b-17b7-42a9-9fd2-162233da0513
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:58:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:23.137 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'env', 'PROCESS_TAG=haproxy-db691b6b-17b7-42a9-9fd2-162233da0513', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/db691b6b-17b7-42a9-9fd2-162233da0513.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.139 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.145 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.229 186962 INFO nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Took 8.10 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.230 186962 DEBUG nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.304 186962 INFO nova.compute.manager [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Took 8.69 seconds to build instance.#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.323 186962 DEBUG oslo_concurrency.lockutils [None req-ee06d770-4962-4a11-b172-dae07a858a37 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.524 186962 DEBUG nova.network.neutron [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updated VIF entry in instance network info cache for port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.525 186962 DEBUG nova.network.neutron [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updating instance_info_cache with network_info: [{"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.542 186962 DEBUG oslo_concurrency.lockutils [req-8278eb18-d095-42d6-9108-9c98ded2ecaa req-0340a547-756b-4630-a7db-f65cda05f9fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:58:23 np0005539505 nova_compute[186958]: 2025-11-29 06:58:23.557 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:23 np0005539505 podman[221702]: 2025-11-29 06:58:23.4945597 +0000 UTC m=+0.026396176 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:58:24 np0005539505 podman[221702]: 2025-11-29 06:58:24.505900781 +0000 UTC m=+1.037737227 container create 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:58:24 np0005539505 systemd[1]: Started libpod-conmon-3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a.scope.
Nov 29 01:58:24 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:58:24 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c592b3b8ec7933364cb609b42eec099e1339ca4a1164facb431eb7358b9fc25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:58:24 np0005539505 podman[221702]: 2025-11-29 06:58:24.757838465 +0000 UTC m=+1.289675011 container init 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:58:24 np0005539505 podman[221702]: 2025-11-29 06:58:24.763533492 +0000 UTC m=+1.295369988 container start 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 01:58:24 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [NOTICE]   (221721) : New worker (221723) forked
Nov 29 01:58:24 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [NOTICE]   (221721) : Loading success.
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.546 186962 DEBUG nova.compute.manager [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.546 186962 DEBUG oslo_concurrency.lockutils [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.546 186962 DEBUG oslo_concurrency.lockutils [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.547 186962 DEBUG oslo_concurrency.lockutils [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.547 186962 DEBUG nova.compute.manager [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] No waiting events found dispatching network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.548 186962 WARNING nova.compute.manager [req-5eb632e9-e8e5-42ba-b9d9-42fd1e245ab1 req-20736680-769c-4379-b723-a32ebab72a50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received unexpected event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c for instance with vm_state active and task_state None.#033[00m
Nov 29 01:58:25 np0005539505 nova_compute[186958]: 2025-11-29 06:58:25.696 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:26.935 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:26.936 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:26.936 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:27 np0005539505 nova_compute[186958]: 2025-11-29 06:58:27.656 186962 DEBUG nova.compute.manager [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-changed-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:27 np0005539505 nova_compute[186958]: 2025-11-29 06:58:27.656 186962 DEBUG nova.compute.manager [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Refreshing instance network info cache due to event network-changed-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:58:27 np0005539505 nova_compute[186958]: 2025-11-29 06:58:27.657 186962 DEBUG oslo_concurrency.lockutils [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:58:27 np0005539505 nova_compute[186958]: 2025-11-29 06:58:27.657 186962 DEBUG oslo_concurrency.lockutils [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:58:27 np0005539505 nova_compute[186958]: 2025-11-29 06:58:27.657 186962 DEBUG nova.network.neutron [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Refreshing network info cache for port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:58:27 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:27Z|00203|binding|INFO|Releasing lport 4035feb9-29a5-4ae9-8490-a44f1379821c from this chassis (sb_readonly=0)
Nov 29 01:58:28 np0005539505 nova_compute[186958]: 2025-11-29 06:58:28.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:28 np0005539505 nova_compute[186958]: 2025-11-29 06:58:28.559 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:28 np0005539505 podman[221733]: 2025-11-29 06:58:28.745958099 +0000 UTC m=+0.067120881 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 01:58:28 np0005539505 podman[221732]: 2025-11-29 06:58:28.768273784 +0000 UTC m=+0.096082851 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Nov 29 01:58:29 np0005539505 nova_compute[186958]: 2025-11-29 06:58:29.760 186962 DEBUG nova.network.neutron [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updated VIF entry in instance network info cache for port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:58:29 np0005539505 nova_compute[186958]: 2025-11-29 06:58:29.762 186962 DEBUG nova.network.neutron [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updating instance_info_cache with network_info: [{"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:29 np0005539505 nova_compute[186958]: 2025-11-29 06:58:29.788 186962 DEBUG oslo_concurrency.lockutils [req-df840326-6246-4e61-8559-5a68adebca69 req-7f5851c9-4479-4792-b581-2f0cc9d7589b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b2ea8f54-d424-4c0e-8387-90f8f7f699a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:58:30 np0005539505 nova_compute[186958]: 2025-11-29 06:58:30.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.627 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.628 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.628 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.628 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.629 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.653 186962 INFO nova.compute.manager [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Terminating instance#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.672 186962 DEBUG nova.compute.manager [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:58:33 np0005539505 kernel: tap8c82a0e1-8f (unregistering): left promiscuous mode
Nov 29 01:58:33 np0005539505 NetworkManager[55134]: <info>  [1764399513.7260] device (tap8c82a0e1-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.744 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:33Z|00204|binding|INFO|Releasing lport 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c from this chassis (sb_readonly=0)
Nov 29 01:58:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:33Z|00205|binding|INFO|Setting lport 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c down in Southbound
Nov 29 01:58:33 np0005539505 ovn_controller[95143]: 2025-11-29T06:58:33Z|00206|binding|INFO|Removing iface tap8c82a0e1-8f ovn-installed in OVS
Nov 29 01:58:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:33.754 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:e1:d2 10.100.0.5'], port_security=['fa:16:3e:4c:e1:d2 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'b2ea8f54-d424-4c0e-8387-90f8f7f699a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db691b6b-17b7-42a9-9fd2-162233da0513', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4362be0b90a64d63b2294bbc495486d3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03ee1f45-6435-43da-9a98-5273904b0bb0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.754 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:33.756 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8c82a0e1-8f6b-4fda-87ac-5bb3f267671c in datapath db691b6b-17b7-42a9-9fd2-162233da0513 unbound from our chassis#033[00m
Nov 29 01:58:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:33.760 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db691b6b-17b7-42a9-9fd2-162233da0513, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:58:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:33.762 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1fba3912-b57a-4666-83b7-169fdcd3c5ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:33.763 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 namespace which is not needed anymore#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Nov 29 01:58:33 np0005539505 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002d.scope: Consumed 11.099s CPU time.
Nov 29 01:58:33 np0005539505 podman[221777]: 2025-11-29 06:58:33.785569612 +0000 UTC m=+0.102369916 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:58:33 np0005539505 systemd-machined[153285]: Machine qemu-24-instance-0000002d terminated.
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.936 186962 INFO nova.virt.libvirt.driver [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Instance destroyed successfully.#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.937 186962 DEBUG nova.objects.instance [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lazy-loading 'resources' on Instance uuid b2ea8f54-d424-4c0e-8387-90f8f7f699a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.954 186962 DEBUG nova.virt.libvirt.vif [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:58:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-325119097',display_name='tempest-ServerActionsTestOtherA-server-325119097',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-325119097',id=45,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:58:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4362be0b90a64d63b2294bbc495486d3',ramdisk_id='',reservation_id='r-68lqq68v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-229564135',owner_user_name='tempest-ServerActionsTestOtherA-229564135-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:58:23Z,user_data=None,user_id='812d926ee4ed4159b2e88b7a69990423',uuid=b2ea8f54-d424-4c0e-8387-90f8f7f699a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.955 186962 DEBUG nova.network.os_vif_util [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converting VIF {"id": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "address": "fa:16:3e:4c:e1:d2", "network": {"id": "db691b6b-17b7-42a9-9fd2-162233da0513", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1448344175-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4362be0b90a64d63b2294bbc495486d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c82a0e1-8f", "ovs_interfaceid": "8c82a0e1-8f6b-4fda-87ac-5bb3f267671c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.955 186962 DEBUG nova.network.os_vif_util [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.956 186962 DEBUG os_vif [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.958 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c82a0e1-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.964 186962 INFO os_vif [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4c:e1:d2,bridge_name='br-int',has_traffic_filtering=True,id=8c82a0e1-8f6b-4fda-87ac-5bb3f267671c,network=Network(db691b6b-17b7-42a9-9fd2-162233da0513),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c82a0e1-8f')#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.964 186962 INFO nova.virt.libvirt.driver [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Deleting instance files /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0_del#033[00m
Nov 29 01:58:33 np0005539505 nova_compute[186958]: 2025-11-29 06:58:33.965 186962 INFO nova.virt.libvirt.driver [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Deletion of /var/lib/nova/instances/b2ea8f54-d424-4c0e-8387-90f8f7f699a0_del complete#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.093 186962 DEBUG nova.compute.manager [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-unplugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.093 186962 DEBUG oslo_concurrency.lockutils [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.094 186962 DEBUG oslo_concurrency.lockutils [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.094 186962 DEBUG oslo_concurrency.lockutils [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.095 186962 DEBUG nova.compute.manager [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] No waiting events found dispatching network-vif-unplugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.095 186962 DEBUG nova.compute.manager [req-d0d9507c-f1eb-4aa6-bd47-2d3e9a57565d req-6198e83e-fc2f-45f6-93ad-30251ebe6d18 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-unplugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.100 186962 INFO nova.compute.manager [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.101 186962 DEBUG oslo.service.loopingcall [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.102 186962 DEBUG nova.compute.manager [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:58:34 np0005539505 nova_compute[186958]: 2025-11-29 06:58:34.102 186962 DEBUG nova.network.neutron [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:58:34 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [NOTICE]   (221721) : haproxy version is 2.8.14-c23fe91
Nov 29 01:58:34 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [NOTICE]   (221721) : path to executable is /usr/sbin/haproxy
Nov 29 01:58:34 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [WARNING]  (221721) : Exiting Master process...
Nov 29 01:58:34 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [ALERT]    (221721) : Current worker (221723) exited with code 143 (Terminated)
Nov 29 01:58:34 np0005539505 neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513[221717]: [WARNING]  (221721) : All workers exited. Exiting... (0)
Nov 29 01:58:34 np0005539505 systemd[1]: libpod-3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a.scope: Deactivated successfully.
Nov 29 01:58:34 np0005539505 podman[221821]: 2025-11-29 06:58:34.351300945 +0000 UTC m=+0.496915134 container died 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.077 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399500.076837, c8ab194e-e936-4110-aef3-1eb79dc427c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.078 186962 INFO nova.compute.manager [-] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.098 186962 DEBUG nova.network.neutron [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.110 186962 DEBUG nova.compute.manager [None req-f73e404a-adba-49c1-b86c-25307ece93f1 - - - - - -] [instance: c8ab194e-e936-4110-aef3-1eb79dc427c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.135 186962 INFO nova.compute.manager [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Took 1.03 seconds to deallocate network for instance.#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.241 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.242 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.317 186962 DEBUG nova.compute.provider_tree [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.339 186962 DEBUG nova.scheduler.client.report [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.376 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:35 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a-userdata-shm.mount: Deactivated successfully.
Nov 29 01:58:35 np0005539505 systemd[1]: var-lib-containers-storage-overlay-1c592b3b8ec7933364cb609b42eec099e1339ca4a1164facb431eb7358b9fc25-merged.mount: Deactivated successfully.
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.420 186962 INFO nova.scheduler.client.report [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Deleted allocations for instance b2ea8f54-d424-4c0e-8387-90f8f7f699a0#033[00m
Nov 29 01:58:35 np0005539505 nova_compute[186958]: 2025-11-29 06:58:35.523 186962 DEBUG oslo_concurrency.lockutils [None req-04b47b50-ab49-473f-a9cc-e189e99c1145 812d926ee4ed4159b2e88b7a69990423 4362be0b90a64d63b2294bbc495486d3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:36 np0005539505 podman[221821]: 2025-11-29 06:58:36.070349377 +0000 UTC m=+2.215963536 container cleanup 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:58:36 np0005539505 systemd[1]: libpod-conmon-3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a.scope: Deactivated successfully.
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.253 186962 DEBUG nova.compute.manager [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.254 186962 DEBUG oslo_concurrency.lockutils [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.255 186962 DEBUG oslo_concurrency.lockutils [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.255 186962 DEBUG oslo_concurrency.lockutils [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b2ea8f54-d424-4c0e-8387-90f8f7f699a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.256 186962 DEBUG nova.compute.manager [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] No waiting events found dispatching network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.257 186962 WARNING nova.compute.manager [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received unexpected event network-vif-plugged-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 01:58:36 np0005539505 nova_compute[186958]: 2025-11-29 06:58:36.257 186962 DEBUG nova.compute.manager [req-06c92f38-4079-4c94-b1ed-260812fdaa6b req-24cbd849-b8ea-43a5-b4d6-0b1380056220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Received event network-vif-deleted-8c82a0e1-8f6b-4fda-87ac-5bb3f267671c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:58:37 np0005539505 podman[221869]: 2025-11-29 06:58:37.87467551 +0000 UTC m=+1.777088235 container remove 3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.880 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4d2dda-1dd4-4e7f-8c0d-2dcfc6121e4a]: (4, ('Sat Nov 29 06:58:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a)\n3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a\nSat Nov 29 06:58:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 (3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a)\n3295ef455ba75230cff60aae157b283c6c6bccaab4dfe0ac8c33f97b1d1a117a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.881 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e883d8f-1006-47dd-9043-15152b80214e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.882 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb691b6b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:58:37 np0005539505 nova_compute[186958]: 2025-11-29 06:58:37.884 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539505 kernel: tapdb691b6b-10: left promiscuous mode
Nov 29 01:58:37 np0005539505 nova_compute[186958]: 2025-11-29 06:58:37.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539505 nova_compute[186958]: 2025-11-29 06:58:37.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.915 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5084a8b8-d9ba-40bb-bdb9-a444eda14d6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.931 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f2f9d9e0-6dee-4fb6-a0f6-7206df91ba7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.932 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bec42519-4c51-4443-b3d6-b10deed639c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.955 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6541b302-2663-4a07-8e43-903277a37b8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495042, 'reachable_time': 31994, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221884, 'error': None, 'target': 'ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.958 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-db691b6b-17b7-42a9-9fd2-162233da0513 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:58:37 np0005539505 systemd[1]: run-netns-ovnmeta\x2ddb691b6b\x2d17b7\x2d42a9\x2d9fd2\x2d162233da0513.mount: Deactivated successfully.
Nov 29 01:58:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:37.959 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ab7aa4-8ffa-4018-ac29-e8ba408ce71e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:58:38 np0005539505 nova_compute[186958]: 2025-11-29 06:58:38.468 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:38 np0005539505 nova_compute[186958]: 2025-11-29 06:58:38.562 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:38 np0005539505 nova_compute[186958]: 2025-11-29 06:58:38.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:39 np0005539505 podman[221889]: 2025-11-29 06:58:39.719249725 +0000 UTC m=+0.054731467 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:58:39 np0005539505 podman[221890]: 2025-11-29 06:58:39.755908581 +0000 UTC m=+0.089146868 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 29 01:58:42 np0005539505 nova_compute[186958]: 2025-11-29 06:58:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:43 np0005539505 nova_compute[186958]: 2025-11-29 06:58:43.563 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:43 np0005539505 nova_compute[186958]: 2025-11-29 06:58:43.963 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:44 np0005539505 nova_compute[186958]: 2025-11-29 06:58:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:45 np0005539505 nova_compute[186958]: 2025-11-29 06:58:45.539 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:46 np0005539505 podman[221937]: 2025-11-29 06:58:46.749350006 +0000 UTC m=+0.074041144 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 06:58:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 01:58:48 np0005539505 nova_compute[186958]: 2025-11-29 06:58:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:48 np0005539505 nova_compute[186958]: 2025-11-29 06:58:48.565 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:48 np0005539505 podman[221956]: 2025-11-29 06:58:48.721672929 +0000 UTC m=+0.057926430 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 01:58:48 np0005539505 nova_compute[186958]: 2025-11-29 06:58:48.935 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399513.9337864, b2ea8f54-d424-4c0e-8387-90f8f7f699a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:58:48 np0005539505 nova_compute[186958]: 2025-11-29 06:58:48.935 186962 INFO nova.compute.manager [-] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:58:48 np0005539505 nova_compute[186958]: 2025-11-29 06:58:48.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.003 186962 DEBUG nova.compute.manager [None req-c5997ad7-9bed-4a32-b42f-0cbda93c4cbc - - - - - -] [instance: b2ea8f54-d424-4c0e-8387-90f8f7f699a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.005 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.006 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.006 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.007 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.259 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.261 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.23405456542969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.262 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:58:51 np0005539505 nova_compute[186958]: 2025-11-29 06:58:51.262 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.351 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.352 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.385 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.409 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.410 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 01:58:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:53.449 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:58:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:58:53.450 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.464 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.503 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.505 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.564 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.567 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:53 np0005539505 nova_compute[186958]: 2025-11-29 06:58:53.967 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:54 np0005539505 nova_compute[186958]: 2025-11-29 06:58:54.596 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:58:55 np0005539505 nova_compute[186958]: 2025-11-29 06:58:55.627 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:58:55 np0005539505 nova_compute[186958]: 2025-11-29 06:58:55.628 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:58:56 np0005539505 nova_compute[186958]: 2025-11-29 06:58:56.623 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:56 np0005539505 nova_compute[186958]: 2025-11-29 06:58:56.624 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:58:56 np0005539505 nova_compute[186958]: 2025-11-29 06:58:56.624 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:58:58 np0005539505 nova_compute[186958]: 2025-11-29 06:58:58.572 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:58 np0005539505 nova_compute[186958]: 2025-11-29 06:58:58.969 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:58:59 np0005539505 podman[221976]: 2025-11-29 06:58:59.734001682 +0000 UTC m=+0.056631893 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 01:58:59 np0005539505 podman[221975]: 2025-11-29 06:58:59.760350255 +0000 UTC m=+0.078405042 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git)
Nov 29 01:59:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:00.454 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.105 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.106 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.106 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.106 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.106 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:59:01 np0005539505 nova_compute[186958]: 2025-11-29 06:59:01.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.427 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.428 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.470 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.598 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.602 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.603 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.610 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.610 186962 INFO nova.compute.claims [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.758 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.831 186962 DEBUG nova.compute.provider_tree [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.884 186962 DEBUG nova.scheduler.client.report [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.907 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.908 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 01:59:03 np0005539505 nova_compute[186958]: 2025-11-29 06:59:03.971 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.025 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.026 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.063 186962 INFO nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.094 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.558 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.559 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.560 186962 INFO nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Creating image(s)#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.560 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.560 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.561 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.574 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.602 186962 DEBUG nova.policy [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4ecd161098b5422084003b39f0504a8f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98df116965b74e4a9985049062e65162', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.682 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.683 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.683 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.694 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:04 np0005539505 podman[222021]: 2025-11-29 06:59:04.716038415 +0000 UTC m=+0.053781449 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.749 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:04 np0005539505 nova_compute[186958]: 2025-11-29 06:59:04.750 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.787 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk 1073741824" returned: 0 in 1.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.788 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.789 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.875 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.876 186962 DEBUG nova.virt.disk.api [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Checking if we can resize image /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.876 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.949 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.950 186962 DEBUG nova.virt.disk.api [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Cannot resize image /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.950 186962 DEBUG nova.objects.instance [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'migration_context' on Instance uuid 7eb30444-a881-4d6c-b193-c84ee550e6d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.981 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.982 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Ensure instance console log exists: /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.982 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.982 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:05 np0005539505 nova_compute[186958]: 2025-11-29 06:59:05.983 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:06 np0005539505 nova_compute[186958]: 2025-11-29 06:59:06.185 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Successfully created port: d128a673-234f-43e3-8e11-622b61d840ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 01:59:08 np0005539505 nova_compute[186958]: 2025-11-29 06:59:08.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:08 np0005539505 nova_compute[186958]: 2025-11-29 06:59:08.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:10 np0005539505 podman[222054]: 2025-11-29 06:59:10.731632182 +0000 UTC m=+0.061958000 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 01:59:10 np0005539505 podman[222055]: 2025-11-29 06:59:10.794998882 +0000 UTC m=+0.121357393 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:59:10 np0005539505 nova_compute[186958]: 2025-11-29 06:59:10.889 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Successfully updated port: d128a673-234f-43e3-8e11-622b61d840ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 01:59:10 np0005539505 nova_compute[186958]: 2025-11-29 06:59:10.946 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:59:10 np0005539505 nova_compute[186958]: 2025-11-29 06:59:10.946 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquired lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:59:10 np0005539505 nova_compute[186958]: 2025-11-29 06:59:10.947 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 01:59:11 np0005539505 nova_compute[186958]: 2025-11-29 06:59:11.114 186962 DEBUG nova.compute.manager [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-changed-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:11 np0005539505 nova_compute[186958]: 2025-11-29 06:59:11.115 186962 DEBUG nova.compute.manager [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Refreshing instance network info cache due to event network-changed-d128a673-234f-43e3-8e11-622b61d840ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 01:59:11 np0005539505 nova_compute[186958]: 2025-11-29 06:59:11.116 186962 DEBUG oslo_concurrency.lockutils [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 01:59:11 np0005539505 nova_compute[186958]: 2025-11-29 06:59:11.354 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 01:59:13 np0005539505 nova_compute[186958]: 2025-11-29 06:59:13.578 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:13 np0005539505 nova_compute[186958]: 2025-11-29 06:59:13.976 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:17 np0005539505 podman[222104]: 2025-11-29 06:59:17.726210119 +0000 UTC m=+0.058360933 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:59:18 np0005539505 nova_compute[186958]: 2025-11-29 06:59:18.580 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:18 np0005539505 nova_compute[186958]: 2025-11-29 06:59:18.978 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:19 np0005539505 podman[222124]: 2025-11-29 06:59:19.722982662 +0000 UTC m=+0.051855773 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 01:59:20 np0005539505 nova_compute[186958]: 2025-11-29 06:59:20.552 186962 DEBUG nova.network.neutron [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Updating instance_info_cache with network_info: [{"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.190 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Releasing lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.191 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Instance network_info: |[{"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.191 186962 DEBUG oslo_concurrency.lockutils [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.191 186962 DEBUG nova.network.neutron [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Refreshing network info cache for port d128a673-234f-43e3-8e11-622b61d840ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.194 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Start _get_guest_xml network_info=[{"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.200 186962 WARNING nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.214 186962 DEBUG nova.virt.libvirt.host [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.215 186962 DEBUG nova.virt.libvirt.host [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.219 186962 DEBUG nova.virt.libvirt.host [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.220 186962 DEBUG nova.virt.libvirt.host [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.222 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.223 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.223 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.224 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.225 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.225 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.225 186962 DEBUG nova.virt.hardware [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.229 186962 DEBUG nova.virt.libvirt.vif [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-725319999',display_name='tempest-DeleteServersTestJSON-server-725319999',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-725319999',id=47,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-rimj9cnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:04Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=7eb30444-a881-4d6c-b193-c84ee550e6d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.229 186962 DEBUG nova.network.os_vif_util [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.230 186962 DEBUG nova.network.os_vif_util [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.231 186962 DEBUG nova.objects.instance [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7eb30444-a881-4d6c-b193-c84ee550e6d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.311 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <uuid>7eb30444-a881-4d6c-b193-c84ee550e6d5</uuid>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <name>instance-0000002f</name>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:name>tempest-DeleteServersTestJSON-server-725319999</nova:name>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 06:59:21</nova:creationTime>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:user uuid="4ecd161098b5422084003b39f0504a8f">tempest-DeleteServersTestJSON-1973671383-project-member</nova:user>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:project uuid="98df116965b74e4a9985049062e65162">tempest-DeleteServersTestJSON-1973671383</nova:project>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        <nova:port uuid="d128a673-234f-43e3-8e11-622b61d840ec">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <system>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="serial">7eb30444-a881-4d6c-b193-c84ee550e6d5</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="uuid">7eb30444-a881-4d6c-b193-c84ee550e6d5</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </system>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <os>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </os>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <features>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </features>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </clock>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  <devices>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.config"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </disk>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:22:ab:b0"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <target dev="tapd128a673-23"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </interface>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/console.log" append="off"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </serial>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <video>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </video>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </rng>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 01:59:21 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 01:59:21 np0005539505 nova_compute[186958]:  </devices>
Nov 29 01:59:21 np0005539505 nova_compute[186958]: </domain>
Nov 29 01:59:21 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.313 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Preparing to wait for external event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.313 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.313 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.314 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.315 186962 DEBUG nova.virt.libvirt.vif [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T06:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-725319999',display_name='tempest-DeleteServersTestJSON-server-725319999',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-725319999',id=47,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-rimj9cnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T06:59:04Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=7eb30444-a881-4d6c-b193-c84ee550e6d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.315 186962 DEBUG nova.network.os_vif_util [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.316 186962 DEBUG nova.network.os_vif_util [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.316 186962 DEBUG os_vif [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.317 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.317 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.318 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.321 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.321 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd128a673-23, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.322 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd128a673-23, col_values=(('external_ids', {'iface-id': 'd128a673-234f-43e3-8e11-622b61d840ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:ab:b0', 'vm-uuid': '7eb30444-a881-4d6c-b193-c84ee550e6d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.324 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:21 np0005539505 NetworkManager[55134]: <info>  [1764399561.3255] manager: (tapd128a673-23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.327 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.333 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:21 np0005539505 nova_compute[186958]: 2025-11-29 06:59:21.334 186962 INFO os_vif [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23')#033[00m
Nov 29 01:59:22 np0005539505 nova_compute[186958]: 2025-11-29 06:59:22.371 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:59:22 np0005539505 nova_compute[186958]: 2025-11-29 06:59:22.373 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 01:59:22 np0005539505 nova_compute[186958]: 2025-11-29 06:59:22.373 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] No VIF found with MAC fa:16:3e:22:ab:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 01:59:22 np0005539505 nova_compute[186958]: 2025-11-29 06:59:22.374 186962 INFO nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Using config drive#033[00m
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.582 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.730 186962 INFO nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Creating config drive at /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.config#033[00m
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.735 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5__ylakg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.861 186962 DEBUG oslo_concurrency.processutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5__ylakg" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 01:59:23 np0005539505 kernel: tapd128a673-23: entered promiscuous mode
Nov 29 01:59:23 np0005539505 NetworkManager[55134]: <info>  [1764399563.9446] manager: (tapd128a673-23): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Nov 29 01:59:23 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:23Z|00207|binding|INFO|Claiming lport d128a673-234f-43e3-8e11-622b61d840ec for this chassis.
Nov 29 01:59:23 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:23Z|00208|binding|INFO|d128a673-234f-43e3-8e11-622b61d840ec: Claiming fa:16:3e:22:ab:b0 10.100.0.13
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:23 np0005539505 nova_compute[186958]: 2025-11-29 06:59:23.950 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.965 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ab:b0 10.100.0.13'], port_security=['fa:16:3e:22:ab:b0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7eb30444-a881-4d6c-b193-c84ee550e6d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '2', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d128a673-234f-43e3-8e11-622b61d840ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.966 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d128a673-234f-43e3-8e11-622b61d840ec in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd bound to our chassis#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.967 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd#033[00m
Nov 29 01:59:23 np0005539505 systemd-udevd[222164]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.980 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48455b51-bfbe-43ba-8f68-48f20c84c082]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.982 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd9eb57e-b1 in ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.984 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd9eb57e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.985 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76bbff57-d747-4fb4-a182-085fd87a3641]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:23.985 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c59b51c6-8ba2-4e76-986a-90b99bad99f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:23 np0005539505 systemd-machined[153285]: New machine qemu-25-instance-0000002f.
Nov 29 01:59:23 np0005539505 NetworkManager[55134]: <info>  [1764399563.9985] device (tapd128a673-23): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 01:59:23 np0005539505 NetworkManager[55134]: <info>  [1764399563.9996] device (tapd128a673-23): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.004 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[02b9bb03-57d8-430e-a7db-394a0dcf225d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:24Z|00209|binding|INFO|Setting lport d128a673-234f-43e3-8e11-622b61d840ec ovn-installed in OVS
Nov 29 01:59:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:24Z|00210|binding|INFO|Setting lport d128a673-234f-43e3-8e11-622b61d840ec up in Southbound
Nov 29 01:59:24 np0005539505 systemd[1]: Started Virtual Machine qemu-25-instance-0000002f.
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.020 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f3b8a1f9-5e59-4e73-b461-ec5ace7acd6c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.053 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d09c00c-6743-4549-b2ca-c77d4597d239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.061 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c8389c1b-3e9c-43fd-8fe1-0764ea6fa113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 NetworkManager[55134]: <info>  [1764399564.0621] manager: (tapfd9eb57e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.102 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0997e784-e9f4-4fd5-a12a-69ebf95cb9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.106 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[845f8d09-a019-4f13-926b-267ac3aac1e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 NetworkManager[55134]: <info>  [1764399564.1284] device (tapfd9eb57e-b0): carrier: link connected
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.132 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[be490fa8-27f5-42bf-b08b-907882a5e0e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.149 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[38d130f1-c78a-4393-b633-eb7b2c1b6323]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501171, 'reachable_time': 25458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222197, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.165 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[65a28309-574f-4dda-a02c-9c6d2262da76]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:80ac'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501171, 'tstamp': 501171}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222198, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.181 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9b880ff2-4392-47dc-9843-8ad15b65807d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd9eb57e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:80:ac'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501171, 'reachable_time': 25458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222199, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.210 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[53a6b110-ea3e-40af-865a-2db361394c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.263 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[15d6528b-be04-401a-b89e-f5078bcf3d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.265 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.266 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.266 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd9eb57e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:24 np0005539505 kernel: tapfd9eb57e-b0: entered promiscuous mode
Nov 29 01:59:24 np0005539505 NetworkManager[55134]: <info>  [1764399564.2691] manager: (tapfd9eb57e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.269 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.272 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd9eb57e-b0, col_values=(('external_ids', {'iface-id': 'e7b4cb4f-cb6d-4f0e-8c8d-34c743671595'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:24 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:24Z|00211|binding|INFO|Releasing lport e7b4cb4f-cb6d-4f0e-8c8d-34c743671595 from this chassis (sb_readonly=0)
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.288 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.289 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.290 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e74b56a-9c24-472d-be8a-ed87d90a5eef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.291 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.pid.haproxy
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID fd9eb57e-b1f8-4bae-a60f-8e40613556cd
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 01:59:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:24.293 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'env', 'PROCESS_TAG=haproxy-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd9eb57e-b1f8-4bae-a60f-8e40613556cd.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.377 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399564.3768086, 7eb30444-a881-4d6c-b193-c84ee550e6d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.378 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] VM Started (Lifecycle Event)#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.403 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.408 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399564.3769999, 7eb30444-a881-4d6c-b193-c84ee550e6d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.409 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] VM Paused (Lifecycle Event)#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.438 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.442 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.461 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.585 186962 DEBUG nova.compute.manager [req-c4d2dcb3-9f03-4dd4-8a68-aa74818d51e9 req-2b410d8f-aefa-4ada-81df-1227de070299 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.586 186962 DEBUG oslo_concurrency.lockutils [req-c4d2dcb3-9f03-4dd4-8a68-aa74818d51e9 req-2b410d8f-aefa-4ada-81df-1227de070299 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.587 186962 DEBUG oslo_concurrency.lockutils [req-c4d2dcb3-9f03-4dd4-8a68-aa74818d51e9 req-2b410d8f-aefa-4ada-81df-1227de070299 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.587 186962 DEBUG oslo_concurrency.lockutils [req-c4d2dcb3-9f03-4dd4-8a68-aa74818d51e9 req-2b410d8f-aefa-4ada-81df-1227de070299 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.587 186962 DEBUG nova.compute.manager [req-c4d2dcb3-9f03-4dd4-8a68-aa74818d51e9 req-2b410d8f-aefa-4ada-81df-1227de070299 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Processing event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.588 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.594 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399564.5939724, 7eb30444-a881-4d6c-b193-c84ee550e6d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.595 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.597 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.601 186962 INFO nova.virt.libvirt.driver [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Instance spawned successfully.#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.602 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.634 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.642 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.645 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.645 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.646 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.646 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.647 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.647 186962 DEBUG nova.virt.libvirt.driver [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.716 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 01:59:24 np0005539505 podman[222237]: 2025-11-29 06:59:24.656636516 +0000 UTC m=+0.025770268 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 01:59:24 np0005539505 podman[222237]: 2025-11-29 06:59:24.835034711 +0000 UTC m=+0.204168443 container create 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.907 186962 INFO nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Took 20.35 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 01:59:24 np0005539505 nova_compute[186958]: 2025-11-29 06:59:24.908 186962 DEBUG nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:24 np0005539505 systemd[1]: Started libpod-conmon-5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa.scope.
Nov 29 01:59:24 np0005539505 systemd[1]: Started libcrun container.
Nov 29 01:59:24 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e17e37bbe0cc2bfdb8b8dfe17ce1a22625ed6282e90018902cba6239872862bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 01:59:24 np0005539505 podman[222237]: 2025-11-29 06:59:24.981963414 +0000 UTC m=+0.351097166 container init 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 01:59:24 np0005539505 podman[222237]: 2025-11-29 06:59:24.987768414 +0000 UTC m=+0.356902146 container start 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 01:59:25 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [NOTICE]   (222256) : New worker (222258) forked
Nov 29 01:59:25 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [NOTICE]   (222256) : Loading success.
Nov 29 01:59:25 np0005539505 nova_compute[186958]: 2025-11-29 06:59:25.364 186962 DEBUG nova.network.neutron [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Updated VIF entry in instance network info cache for port d128a673-234f-43e3-8e11-622b61d840ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 01:59:25 np0005539505 nova_compute[186958]: 2025-11-29 06:59:25.368 186962 DEBUG nova.network.neutron [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Updating instance_info_cache with network_info: [{"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:25 np0005539505 nova_compute[186958]: 2025-11-29 06:59:25.419 186962 INFO nova.compute.manager [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Took 21.89 seconds to build instance.#033[00m
Nov 29 01:59:25 np0005539505 nova_compute[186958]: 2025-11-29 06:59:25.447 186962 DEBUG oslo_concurrency.lockutils [req-b109ca82-3cba-4cbe-b7ce-ec4d461d40c7 req-cb3a6df9-c425-4e20-aa80-798edaef65c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7eb30444-a881-4d6c-b193-c84ee550e6d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 01:59:25 np0005539505 nova_compute[186958]: 2025-11-29 06:59:25.490 186962 DEBUG oslo_concurrency.lockutils [None req-4446076a-0c40-4498-b347-afda1b669bea 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:26 np0005539505 nova_compute[186958]: 2025-11-29 06:59:26.326 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:26.936 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:26.937 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:26.938 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:28 np0005539505 nova_compute[186958]: 2025-11-29 06:59:28.585 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.488 186962 DEBUG nova.compute.manager [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.488 186962 DEBUG oslo_concurrency.lockutils [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.489 186962 DEBUG oslo_concurrency.lockutils [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.489 186962 DEBUG oslo_concurrency.lockutils [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.490 186962 DEBUG nova.compute.manager [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] No waiting events found dispatching network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:59:29 np0005539505 nova_compute[186958]: 2025-11-29 06:59:29.490 186962 WARNING nova.compute.manager [req-4ae2fe61-8040-463d-b6fd-5531b4e803c6 req-aa8ca1ab-9811-4e0c-af02-f3d3960795f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received unexpected event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec for instance with vm_state active and task_state None.#033[00m
Nov 29 01:59:30 np0005539505 podman[222268]: 2025-11-29 06:59:30.760020899 +0000 UTC m=+0.066396670 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 01:59:30 np0005539505 podman[222267]: 2025-11-29 06:59:30.780353596 +0000 UTC m=+0.099323156 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc.)
Nov 29 01:59:31 np0005539505 nova_compute[186958]: 2025-11-29 06:59:31.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.152 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.153 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.153 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.154 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.154 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.168 186962 INFO nova.compute.manager [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Terminating instance#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.179 186962 DEBUG nova.compute.manager [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 01:59:32 np0005539505 kernel: tapd128a673-23 (unregistering): left promiscuous mode
Nov 29 01:59:32 np0005539505 NetworkManager[55134]: <info>  [1764399572.2076] device (tapd128a673-23): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.215 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:32Z|00212|binding|INFO|Releasing lport d128a673-234f-43e3-8e11-622b61d840ec from this chassis (sb_readonly=0)
Nov 29 01:59:32 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:32Z|00213|binding|INFO|Setting lport d128a673-234f-43e3-8e11-622b61d840ec down in Southbound
Nov 29 01:59:32 np0005539505 ovn_controller[95143]: 2025-11-29T06:59:32Z|00214|binding|INFO|Removing iface tapd128a673-23 ovn-installed in OVS
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.219 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.232 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:32.232 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:ab:b0 10.100.0.13'], port_security=['fa:16:3e:22:ab:b0 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7eb30444-a881-4d6c-b193-c84ee550e6d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98df116965b74e4a9985049062e65162', 'neutron:revision_number': '4', 'neutron:security_group_ids': '234720a9-9cd1-4b87-9bec-1abfe8ff0514', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e694bb30-a43a-4d18-87fa-e5c0dd8850c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d128a673-234f-43e3-8e11-622b61d840ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:59:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:32.234 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d128a673-234f-43e3-8e11-622b61d840ec in datapath fd9eb57e-b1f8-4bae-a60f-8e40613556cd unbound from our chassis#033[00m
Nov 29 01:59:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:32.236 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 01:59:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:32.237 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[132ccd3b-f1a2-49ce-b851-37c83cd7d18e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:32.238 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd namespace which is not needed anymore#033[00m
Nov 29 01:59:32 np0005539505 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Nov 29 01:59:32 np0005539505 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002f.scope: Consumed 8.096s CPU time.
Nov 29 01:59:32 np0005539505 systemd-machined[153285]: Machine qemu-25-instance-0000002f terminated.
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.405 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.411 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.455 186962 INFO nova.virt.libvirt.driver [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Instance destroyed successfully.#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.456 186962 DEBUG nova.objects.instance [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lazy-loading 'resources' on Instance uuid 7eb30444-a881-4d6c-b193-c84ee550e6d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.510 186962 DEBUG nova.virt.libvirt.vif [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T06:58:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-725319999',display_name='tempest-DeleteServersTestJSON-server-725319999',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-725319999',id=47,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T06:59:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98df116965b74e4a9985049062e65162',ramdisk_id='',reservation_id='r-rimj9cnb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-1973671383',owner_user_name='tempest-DeleteServersTestJSON-1973671383-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T06:59:25Z,user_data=None,user_id='4ecd161098b5422084003b39f0504a8f',uuid=7eb30444-a881-4d6c-b193-c84ee550e6d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.510 186962 DEBUG nova.network.os_vif_util [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converting VIF {"id": "d128a673-234f-43e3-8e11-622b61d840ec", "address": "fa:16:3e:22:ab:b0", "network": {"id": "fd9eb57e-b1f8-4bae-a60f-8e40613556cd", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1503104692-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98df116965b74e4a9985049062e65162", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd128a673-23", "ovs_interfaceid": "d128a673-234f-43e3-8e11-622b61d840ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.511 186962 DEBUG nova.network.os_vif_util [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.511 186962 DEBUG os_vif [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.513 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.513 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd128a673-23, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.515 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.516 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.518 186962 INFO os_vif [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:ab:b0,bridge_name='br-int',has_traffic_filtering=True,id=d128a673-234f-43e3-8e11-622b61d840ec,network=Network(fd9eb57e-b1f8-4bae-a60f-8e40613556cd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd128a673-23')#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.519 186962 INFO nova.virt.libvirt.driver [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Deleting instance files /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5_del#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.519 186962 INFO nova.virt.libvirt.driver [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Deletion of /var/lib/nova/instances/7eb30444-a881-4d6c-b193-c84ee550e6d5_del complete#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.616 186962 INFO nova.compute.manager [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.617 186962 DEBUG oslo.service.loopingcall [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.617 186962 DEBUG nova.compute.manager [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 01:59:32 np0005539505 nova_compute[186958]: 2025-11-29 06:59:32.617 186962 DEBUG nova.network.neutron [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.143 186962 DEBUG nova.compute.manager [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-unplugged-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.144 186962 DEBUG oslo_concurrency.lockutils [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.144 186962 DEBUG oslo_concurrency.lockutils [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.144 186962 DEBUG oslo_concurrency.lockutils [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.144 186962 DEBUG nova.compute.manager [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] No waiting events found dispatching network-vif-unplugged-d128a673-234f-43e3-8e11-622b61d840ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.145 186962 DEBUG nova.compute.manager [req-d92c982c-1d68-4069-9b15-8f500322c0a8 req-88f8eddc-ffd1-406a-bbd9-3abcc4f738c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-unplugged-d128a673-234f-43e3-8e11-622b61d840ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 01:59:33 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [NOTICE]   (222256) : haproxy version is 2.8.14-c23fe91
Nov 29 01:59:33 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [NOTICE]   (222256) : path to executable is /usr/sbin/haproxy
Nov 29 01:59:33 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [WARNING]  (222256) : Exiting Master process...
Nov 29 01:59:33 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [ALERT]    (222256) : Current worker (222258) exited with code 143 (Terminated)
Nov 29 01:59:33 np0005539505 neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd[222252]: [WARNING]  (222256) : All workers exited. Exiting... (0)
Nov 29 01:59:33 np0005539505 systemd[1]: libpod-5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa.scope: Deactivated successfully.
Nov 29 01:59:33 np0005539505 podman[222332]: 2025-11-29 06:59:33.440730803 +0000 UTC m=+1.091573987 container died 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 01:59:33 np0005539505 nova_compute[186958]: 2025-11-29 06:59:33.587 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa-userdata-shm.mount: Deactivated successfully.
Nov 29 01:59:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-e17e37bbe0cc2bfdb8b8dfe17ce1a22625ed6282e90018902cba6239872862bb-merged.mount: Deactivated successfully.
Nov 29 01:59:34 np0005539505 podman[222332]: 2025-11-29 06:59:34.009053523 +0000 UTC m=+1.659896707 container cleanup 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 01:59:34 np0005539505 systemd[1]: libpod-conmon-5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa.scope: Deactivated successfully.
Nov 29 01:59:34 np0005539505 podman[222381]: 2025-11-29 06:59:34.3517698 +0000 UTC m=+0.317154468 container remove 5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.356 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[390dffb5-fd83-4e8b-b56f-45fbfc600e4d]: (4, ('Sat Nov 29 06:59:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa)\n5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa\nSat Nov 29 06:59:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd (5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa)\n5c9b00198b89c1e3eccfe453f2cb78cb25d4eaf565502a3cd195efaf9447b5fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31eb3050-6e51-47f6-b9ef-8edd1fd35edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.364 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd9eb57e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 01:59:34 np0005539505 nova_compute[186958]: 2025-11-29 06:59:34.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:34 np0005539505 kernel: tapfd9eb57e-b0: left promiscuous mode
Nov 29 01:59:34 np0005539505 nova_compute[186958]: 2025-11-29 06:59:34.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.384 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6ddc067a-ee38-4b1d-bb2d-7dd9bc05933c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.398 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8403fdde-87bd-4c39-9699-bca9b6c21e06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.400 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[941ae7ea-d42b-473b-ae5b-de4be108c3fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.420 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c81da33e-e4ef-474c-81c6-3646831be29f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501163, 'reachable_time': 17174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222397, 'error': None, 'target': 'ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:34 np0005539505 systemd[1]: run-netns-ovnmeta\x2dfd9eb57e\x2db1f8\x2d4bae\x2da60f\x2d8e40613556cd.mount: Deactivated successfully.
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.427 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd9eb57e-b1f8-4bae-a60f-8e40613556cd deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 01:59:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:34.427 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[a33c07cc-6483-4216-a88f-07fa00fb661b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 01:59:35 np0005539505 podman[222398]: 2025-11-29 06:59:35.765902013 +0000 UTC m=+0.081734660 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.414 186962 DEBUG nova.network.neutron [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.544 186962 DEBUG nova.compute.manager [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.545 186962 DEBUG oslo_concurrency.lockutils [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.545 186962 DEBUG oslo_concurrency.lockutils [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.545 186962 DEBUG oslo_concurrency.lockutils [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.545 186962 DEBUG nova.compute.manager [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] No waiting events found dispatching network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.546 186962 WARNING nova.compute.manager [req-8547b23b-7648-4d62-b92b-4c1d54107c9f req-27d939f2-77ef-42b0-af6c-09ee66b86df5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received unexpected event network-vif-plugged-d128a673-234f-43e3-8e11-622b61d840ec for instance with vm_state active and task_state deleting.#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.549 186962 DEBUG nova.compute.manager [req-59a28a35-fddc-4ccb-bfa7-8c1c75dfa548 req-795abe35-5ab7-4e44-8852-cc202ebbf47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Received event network-vif-deleted-d128a673-234f-43e3-8e11-622b61d840ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.549 186962 INFO nova.compute.manager [req-59a28a35-fddc-4ccb-bfa7-8c1c75dfa548 req-795abe35-5ab7-4e44-8852-cc202ebbf47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Neutron deleted interface d128a673-234f-43e3-8e11-622b61d840ec; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.550 186962 DEBUG nova.network.neutron [req-59a28a35-fddc-4ccb-bfa7-8c1c75dfa548 req-795abe35-5ab7-4e44-8852-cc202ebbf47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.583 186962 INFO nova.compute.manager [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Took 3.97 seconds to deallocate network for instance.#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.600 186962 DEBUG nova.compute.manager [req-59a28a35-fddc-4ccb-bfa7-8c1c75dfa548 req-795abe35-5ab7-4e44-8852-cc202ebbf47b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Detach interface failed, port_id=d128a673-234f-43e3-8e11-622b61d840ec, reason: Instance 7eb30444-a881-4d6c-b193-c84ee550e6d5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.725 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.726 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.837 186962 DEBUG nova.compute.provider_tree [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:36 np0005539505 nova_compute[186958]: 2025-11-29 06:59:36.920 186962 DEBUG nova.scheduler.client.report [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:37 np0005539505 nova_compute[186958]: 2025-11-29 06:59:37.040 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:37 np0005539505 nova_compute[186958]: 2025-11-29 06:59:37.070 186962 INFO nova.scheduler.client.report [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Deleted allocations for instance 7eb30444-a881-4d6c-b193-c84ee550e6d5#033[00m
Nov 29 01:59:37 np0005539505 nova_compute[186958]: 2025-11-29 06:59:37.192 186962 DEBUG oslo_concurrency.lockutils [None req-b1476b24-11a7-49be-be83-9d97b29a8f25 4ecd161098b5422084003b39f0504a8f 98df116965b74e4a9985049062e65162 - - default default] Lock "7eb30444-a881-4d6c-b193-c84ee550e6d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.040s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:37 np0005539505 nova_compute[186958]: 2025-11-29 06:59:37.515 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:38 np0005539505 nova_compute[186958]: 2025-11-29 06:59:38.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:41 np0005539505 podman[222418]: 2025-11-29 06:59:41.735404035 +0000 UTC m=+0.064090812 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 01:59:41 np0005539505 podman[222419]: 2025-11-29 06:59:41.770359241 +0000 UTC m=+0.093316200 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 01:59:42 np0005539505 nova_compute[186958]: 2025-11-29 06:59:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:42 np0005539505 nova_compute[186958]: 2025-11-29 06:59:42.517 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:43 np0005539505 nova_compute[186958]: 2025-11-29 06:59:43.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:46 np0005539505 nova_compute[186958]: 2025-11-29 06:59:46.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:46 np0005539505 nova_compute[186958]: 2025-11-29 06:59:46.412 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:47 np0005539505 nova_compute[186958]: 2025-11-29 06:59:47.454 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399572.4525344, 7eb30444-a881-4d6c-b193-c84ee550e6d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 01:59:47 np0005539505 nova_compute[186958]: 2025-11-29 06:59:47.455 186962 INFO nova.compute.manager [-] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] VM Stopped (Lifecycle Event)#033[00m
Nov 29 01:59:47 np0005539505 nova_compute[186958]: 2025-11-29 06:59:47.492 186962 DEBUG nova.compute.manager [None req-4be7efff-95fd-45f6-bb26-40bfedcb534f - - - - - -] [instance: 7eb30444-a881-4d6c-b193-c84ee550e6d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 01:59:47 np0005539505 nova_compute[186958]: 2025-11-29 06:59:47.520 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.413 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.414 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 01:59:48 np0005539505 podman[222469]: 2025-11-29 06:59:48.5629183 +0000 UTC m=+0.114622295 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.617 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.618 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5683MB free_disk=73.23404693603516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.618 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.619 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.715 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.716 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.745 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.760 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.789 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 01:59:48 np0005539505 nova_compute[186958]: 2025-11-29 06:59:48.790 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 01:59:50 np0005539505 podman[222489]: 2025-11-29 06:59:50.763828354 +0000 UTC m=+0.090519798 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 01:59:50 np0005539505 nova_compute[186958]: 2025-11-29 06:59:50.784 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:51 np0005539505 nova_compute[186958]: 2025-11-29 06:59:51.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:52 np0005539505 nova_compute[186958]: 2025-11-29 06:59:52.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:53 np0005539505 nova_compute[186958]: 2025-11-29 06:59:53.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:53 np0005539505 nova_compute[186958]: 2025-11-29 06:59:53.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:54 np0005539505 nova_compute[186958]: 2025-11-29 06:59:54.769 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:54.774 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 01:59:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 06:59:54.776 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 01:59:56 np0005539505 nova_compute[186958]: 2025-11-29 06:59:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:56 np0005539505 nova_compute[186958]: 2025-11-29 06:59:56.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 01:59:57 np0005539505 nova_compute[186958]: 2025-11-29 06:59:57.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 01:59:57 np0005539505 nova_compute[186958]: 2025-11-29 06:59:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 01:59:57 np0005539505 nova_compute[186958]: 2025-11-29 06:59:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 01:59:57 np0005539505 nova_compute[186958]: 2025-11-29 06:59:57.455 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 01:59:57 np0005539505 nova_compute[186958]: 2025-11-29 06:59:57.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 01:59:58 np0005539505 nova_compute[186958]: 2025-11-29 06:59:58.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:00.779 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:01 np0005539505 podman[222511]: 2025-11-29 07:00:01.735382416 +0000 UTC m=+0.061780048 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:00:01 np0005539505 podman[222510]: 2025-11-29 07:00:01.744214606 +0000 UTC m=+0.075260569 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal)
Nov 29 02:00:02 np0005539505 nova_compute[186958]: 2025-11-29 07:00:02.531 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:03 np0005539505 nova_compute[186958]: 2025-11-29 07:00:03.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:03 np0005539505 nova_compute[186958]: 2025-11-29 07:00:03.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:06 np0005539505 podman[222556]: 2025-11-29 07:00:06.746427309 +0000 UTC m=+0.067586792 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:00:07 np0005539505 nova_compute[186958]: 2025-11-29 07:00:07.534 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:08 np0005539505 nova_compute[186958]: 2025-11-29 07:00:08.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:12 np0005539505 nova_compute[186958]: 2025-11-29 07:00:12.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:12 np0005539505 podman[222576]: 2025-11-29 07:00:12.729115764 +0000 UTC m=+0.061632454 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:00:12 np0005539505 podman[222577]: 2025-11-29 07:00:12.827110896 +0000 UTC m=+0.156318942 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:00:13 np0005539505 nova_compute[186958]: 2025-11-29 07:00:13.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:17 np0005539505 nova_compute[186958]: 2025-11-29 07:00:17.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:18 np0005539505 nova_compute[186958]: 2025-11-29 07:00:18.646 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:18 np0005539505 podman[222625]: 2025-11-29 07:00:18.727007668 +0000 UTC m=+0.058320261 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS)
Nov 29 02:00:21 np0005539505 podman[222644]: 2025-11-29 07:00:21.744723315 +0000 UTC m=+0.081750713 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:22 np0005539505 nova_compute[186958]: 2025-11-29 07:00:22.539 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:23 np0005539505 nova_compute[186958]: 2025-11-29 07:00:23.648 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:26.937 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:26.937 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:26.937 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:27 np0005539505 nova_compute[186958]: 2025-11-29 07:00:27.541 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:28 np0005539505 nova_compute[186958]: 2025-11-29 07:00:28.650 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:32 np0005539505 nova_compute[186958]: 2025-11-29 07:00:32.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:32 np0005539505 podman[222665]: 2025-11-29 07:00:32.722241256 +0000 UTC m=+0.051254051 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:00:32 np0005539505 podman[222664]: 2025-11-29 07:00:32.722252116 +0000 UTC m=+0.056795197 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9)
Nov 29 02:00:33 np0005539505 nova_compute[186958]: 2025-11-29 07:00:33.651 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:37 np0005539505 nova_compute[186958]: 2025-11-29 07:00:37.545 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:37 np0005539505 podman[222707]: 2025-11-29 07:00:37.757782362 +0000 UTC m=+0.085828759 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:00:38 np0005539505 nova_compute[186958]: 2025-11-29 07:00:38.654 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:42 np0005539505 nova_compute[186958]: 2025-11-29 07:00:42.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:42 np0005539505 nova_compute[186958]: 2025-11-29 07:00:42.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:43.401 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:43 np0005539505 nova_compute[186958]: 2025-11-29 07:00:43.402 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:43.403 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:00:43 np0005539505 nova_compute[186958]: 2025-11-29 07:00:43.655 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:43 np0005539505 podman[222724]: 2025-11-29 07:00:43.746183027 +0000 UTC m=+0.077791891 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:00:43 np0005539505 podman[222725]: 2025-11-29 07:00:43.761007996 +0000 UTC m=+0.088265217 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.233 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.233 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.266 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.429 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.430 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.438 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.439 186962 INFO nova.compute.claims [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.668 186962 DEBUG nova.compute.provider_tree [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.695 186962 DEBUG nova.scheduler.client.report [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.735 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.736 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.815 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.816 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.837 186962 INFO nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:00:44 np0005539505 nova_compute[186958]: 2025-11-29 07:00:44.857 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.040 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.041 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.042 186962 INFO nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating image(s)#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.042 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.043 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.044 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.060 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.120 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.121 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.121 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.132 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.187 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.188 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.306 186962 DEBUG nova.policy [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.615 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.616 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.495s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.617 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.671 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.673 186962 DEBUG nova.virt.disk.api [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.673 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.732 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.733 186962 DEBUG nova.virt.disk.api [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.734 186962 DEBUG nova.objects.instance [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.755 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.756 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Ensure instance console log exists: /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.757 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.757 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:45 np0005539505 nova_compute[186958]: 2025-11-29 07:00:45.758 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:47 np0005539505 nova_compute[186958]: 2025-11-29 07:00:47.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:47 np0005539505 nova_compute[186958]: 2025-11-29 07:00:47.550 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.083 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.084 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:00:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:00:48 np0005539505 nova_compute[186958]: 2025-11-29 07:00:48.332 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Successfully created port: e5af6202-8a71-48e2-ae69-2b3cb0d3a948 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:00:48 np0005539505 nova_compute[186958]: 2025-11-29 07:00:48.657 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:49 np0005539505 podman[222787]: 2025-11-29 07:00:49.742880397 +0000 UTC m=+0.080953760 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.389 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.431 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.432 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.432 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.433 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.603 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5760MB free_disk=73.22608947753906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.605 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.605 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.773 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 230d36aa-b1ff-4e7d-a024-af0021cd0044 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.773 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:00:50 np0005539505 nova_compute[186958]: 2025-11-29 07:00:50.774 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:00:51 np0005539505 nova_compute[186958]: 2025-11-29 07:00:51.002 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:51 np0005539505 nova_compute[186958]: 2025-11-29 07:00:51.037 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:51 np0005539505 nova_compute[186958]: 2025-11-29 07:00:51.249 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:00:51 np0005539505 nova_compute[186958]: 2025-11-29 07:00:51.250 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:52 np0005539505 nova_compute[186958]: 2025-11-29 07:00:52.129 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Successfully updated port: e5af6202-8a71-48e2-ae69-2b3cb0d3a948 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:00:52 np0005539505 nova_compute[186958]: 2025-11-29 07:00:52.172 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:52 np0005539505 nova_compute[186958]: 2025-11-29 07:00:52.172 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquired lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:52 np0005539505 nova_compute[186958]: 2025-11-29 07:00:52.173 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:00:52 np0005539505 nova_compute[186958]: 2025-11-29 07:00:52.552 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:52 np0005539505 podman[222807]: 2025-11-29 07:00:52.754456473 +0000 UTC m=+0.084152891 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:00:53 np0005539505 nova_compute[186958]: 2025-11-29 07:00:53.234 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:53 np0005539505 nova_compute[186958]: 2025-11-29 07:00:53.235 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:53.406 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:53 np0005539505 nova_compute[186958]: 2025-11-29 07:00:53.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:53 np0005539505 nova_compute[186958]: 2025-11-29 07:00:53.747 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:00:55 np0005539505 nova_compute[186958]: 2025-11-29 07:00:55.074 186962 DEBUG nova.compute.manager [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-changed-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:55 np0005539505 nova_compute[186958]: 2025-11-29 07:00:55.074 186962 DEBUG nova.compute.manager [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Refreshing instance network info cache due to event network-changed-e5af6202-8a71-48e2-ae69-2b3cb0d3a948. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:00:55 np0005539505 nova_compute[186958]: 2025-11-29 07:00:55.075 186962 DEBUG oslo_concurrency.lockutils [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:00:55 np0005539505 nova_compute[186958]: 2025-11-29 07:00:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.176 186962 DEBUG nova.network.neutron [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updating instance_info_cache with network_info: [{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.240 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Releasing lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.240 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance network_info: |[{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.240 186962 DEBUG oslo_concurrency.lockutils [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.241 186962 DEBUG nova.network.neutron [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Refreshing network info cache for port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.244 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start _get_guest_xml network_info=[{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.248 186962 WARNING nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.254 186962 DEBUG nova.virt.libvirt.host [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.255 186962 DEBUG nova.virt.libvirt.host [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.259 186962 DEBUG nova.virt.libvirt.host [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.260 186962 DEBUG nova.virt.libvirt.host [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.261 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.261 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.262 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.262 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.262 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.262 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.262 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.263 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.263 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.263 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.263 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.264 186962 DEBUG nova.virt.hardware [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.268 186962 DEBUG nova.virt.libvirt.vif [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:44Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.269 186962 DEBUG nova.network.os_vif_util [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.270 186962 DEBUG nova.network.os_vif_util [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.270 186962 DEBUG nova.objects.instance [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.289 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <uuid>230d36aa-b1ff-4e7d-a024-af0021cd0044</uuid>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <name>instance-00000035</name>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersAdminTestJSON-server-1580825723</nova:name>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:00:56</nova:creationTime>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        <nova:port uuid="e5af6202-8a71-48e2-ae69-2b3cb0d3a948">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="serial">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="uuid">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:ff:47:55"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <target dev="tape5af6202-8a"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log" append="off"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:00:56 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:00:56 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:00:56 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:00:56 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.290 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Preparing to wait for external event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.290 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.290 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.291 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.291 186962 DEBUG nova.virt.libvirt.vif [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:44Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.292 186962 DEBUG nova.network.os_vif_util [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.292 186962 DEBUG nova.network.os_vif_util [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.293 186962 DEBUG os_vif [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.293 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.294 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.294 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.298 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.299 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5af6202-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.299 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5af6202-8a, col_values=(('external_ids', {'iface-id': 'e5af6202-8a71-48e2-ae69-2b3cb0d3a948', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:47:55', 'vm-uuid': '230d36aa-b1ff-4e7d-a024-af0021cd0044'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.301 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:56 np0005539505 NetworkManager[55134]: <info>  [1764399656.3022] manager: (tape5af6202-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.303 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.311 186962 INFO os_vif [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.368 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.369 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.369 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:ff:47:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:00:56 np0005539505 nova_compute[186958]: 2025-11-29 07:00:56.369 186962 INFO nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Using config drive#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.272 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.272 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.346 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.415 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.416 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.535 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.536 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.544 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.544 186962 INFO nova.compute.claims [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.692 186962 INFO nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating config drive at /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.697 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvww5sht4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.824 186962 DEBUG nova.compute.provider_tree [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.838 186962 DEBUG oslo_concurrency.processutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvww5sht4" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.848 186962 DEBUG nova.scheduler.client.report [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.879 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.879 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:00:57 np0005539505 kernel: tape5af6202-8a: entered promiscuous mode
Nov 29 02:00:57 np0005539505 NetworkManager[55134]: <info>  [1764399657.9133] manager: (tape5af6202-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.914 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:00:57Z|00215|binding|INFO|Claiming lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for this chassis.
Nov 29 02:00:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:00:57Z|00216|binding|INFO|e5af6202-8a71-48e2-ae69-2b3cb0d3a948: Claiming fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.930 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.932 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.934 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:00:57 np0005539505 systemd-udevd[222847]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.947 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f11bc27-81d1-40e5-9d7f-62ce78783bba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.948 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb97f3d85-11 in ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.950 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb97f3d85-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.950 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[62300095-9a5d-480d-aaa2-5d83057d6f5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.951 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[425570b8-9bef-4217-960d-51867951d1d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.953 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.954 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:00:57 np0005539505 NetworkManager[55134]: <info>  [1764399657.9637] device (tape5af6202-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:00:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:57.963 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae97cce-f1d8-498d-bcf7-8735f3ba6ab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:57 np0005539505 NetworkManager[55134]: <info>  [1764399657.9656] device (tape5af6202-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:00:57 np0005539505 systemd-machined[153285]: New machine qemu-26-instance-00000035.
Nov 29 02:00:57 np0005539505 nova_compute[186958]: 2025-11-29 07:00:57.976 186962 INFO nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.001 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.002 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3b0938fa-5302-475b-a47c-4f62c0488ae3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:00:58Z|00217|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 ovn-installed in OVS
Nov 29 02:00:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:00:58Z|00218|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 up in Southbound
Nov 29 02:00:58 np0005539505 systemd[1]: Started Virtual Machine qemu-26-instance-00000035.
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.010 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.036 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3ebef2b8-26d6-4c0b-989c-3650a553d7f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.042 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[41beecd1-2c83-4787-8e82-5d6de113d2a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 NetworkManager[55134]: <info>  [1764399658.0434] manager: (tapb97f3d85-10): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Nov 29 02:00:58 np0005539505 systemd-udevd[222851]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.078 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3f39e5-cb93-4898-ba5a-59088ad485f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.082 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[056215ae-b3c9-4439-b6bd-769f6ac95291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 NetworkManager[55134]: <info>  [1764399658.1046] device (tapb97f3d85-10): carrier: link connected
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.109 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bd07cbb6-3997-43d0-b0c7-098ddc58aa68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.128 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[194081bf-07b5-4257-bda2-ac6770ab444e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 25382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222882, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.144 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5f569477-62d1-4e5e-a903-809f659bab95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe53:e22d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510568, 'tstamp': 510568}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222883, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.155 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.156 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.157 186962 INFO nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Creating image(s)#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.157 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.158 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.158 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.162 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[acea20cd-7f5a-4d99-a863-909f5a6fedf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 25382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222884, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.174 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.198 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[adb40615-ba2b-44fe-bf20-a863df3f58c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.229 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.230 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.231 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.243 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.246 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbdfc6d-9852-49cd-a9e2-c52f32ff8711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.248 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.248 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.249 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:58 np0005539505 NetworkManager[55134]: <info>  [1764399658.2520] manager: (tapb97f3d85-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 29 02:00:58 np0005539505 kernel: tapb97f3d85-10: entered promiscuous mode
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.256 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:00:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:00:58Z|00219|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.258 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.260 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.260 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[092af6b6-0d15-43b1-b45c-59bc35a91e18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.261 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-b97f3d85-11c0-4475-aea6-e8da158df42a
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/b97f3d85-11c0-4475-aea6-e8da158df42a.pid.haproxy
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID b97f3d85-11c0-4475-aea6-e8da158df42a
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:00:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:00:58.262 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'env', 'PROCESS_TAG=haproxy-b97f3d85-11c0-4475-aea6-e8da158df42a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b97f3d85-11c0-4475-aea6-e8da158df42a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.270 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.299 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.300 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.356 186962 DEBUG nova.policy [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ed0cd4810c054eaa9d04fe9ef74bf011', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9a8c0812295749a6b8b144654ceafd3c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.370 186962 DEBUG nova.compute.manager [req-4105eda4-da2c-4120-baa7-3c9195bb730e req-cc43fe52-f99d-41f0-9d1a-8a4844b330b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.370 186962 DEBUG oslo_concurrency.lockutils [req-4105eda4-da2c-4120-baa7-3c9195bb730e req-cc43fe52-f99d-41f0-9d1a-8a4844b330b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.371 186962 DEBUG oslo_concurrency.lockutils [req-4105eda4-da2c-4120-baa7-3c9195bb730e req-cc43fe52-f99d-41f0-9d1a-8a4844b330b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.371 186962 DEBUG oslo_concurrency.lockutils [req-4105eda4-da2c-4120-baa7-3c9195bb730e req-cc43fe52-f99d-41f0-9d1a-8a4844b330b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.371 186962 DEBUG nova.compute.manager [req-4105eda4-da2c-4120-baa7-3c9195bb730e req-cc43fe52-f99d-41f0-9d1a-8a4844b330b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Processing event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.646 186962 DEBUG nova.network.neutron [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updated VIF entry in instance network info cache for port e5af6202-8a71-48e2-ae69-2b3cb0d3a948. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.647 186962 DEBUG nova.network.neutron [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updating instance_info_cache with network_info: [{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.662 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.678 186962 DEBUG oslo_concurrency.lockutils [req-15d72025-69cd-4e13-b4f8-22c6d923b64a req-9f448ac9-4141-4e82-b39f-72eda10bb22a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:00:58 np0005539505 podman[222925]: 2025-11-29 07:00:58.585112137 +0000 UTC m=+0.022675332 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.899 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399658.8980892, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.900 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Started (Lifecycle Event)#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.902 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.907 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.911 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance spawned successfully.#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.911 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.953 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.961 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.965 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.965 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.966 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.966 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.967 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:58 np0005539505 nova_compute[186958]: 2025-11-29 07:00:58.967 186962 DEBUG nova.virt.libvirt.driver [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.027 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk 1073741824" returned: 0 in 0.727s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.028 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.029 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.065 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.066 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399658.8998377, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.066 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.087 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.088 186962 DEBUG nova.virt.disk.api [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Checking if we can resize image /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.089 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.112 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.117 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399658.905836, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.117 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.144 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.144 186962 DEBUG nova.virt.disk.api [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Cannot resize image /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.145 186962 DEBUG nova.objects.instance [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lazy-loading 'migration_context' on Instance uuid af2973a1-4571-44fc-8f98-148372c33b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.149 186962 INFO nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Took 14.11 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.150 186962 DEBUG nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.154 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.161 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.164 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.165 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Ensure instance console log exists: /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.165 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.166 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.166 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.200 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.249 186962 INFO nova.compute.manager [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Took 14.90 seconds to build instance.#033[00m
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.268 186962 DEBUG oslo_concurrency.lockutils [None req-860d1f07-9cc7-494c-b74f-61a7b181b2fd cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:00:59 np0005539505 podman[222925]: 2025-11-29 07:00:59.694457893 +0000 UTC m=+1.132021058 container create 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:00:59 np0005539505 nova_compute[186958]: 2025-11-29 07:00:59.865 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Successfully created port: 6926042d-b0e8-404f-86b4-5a7e05c6b07f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.526 186962 DEBUG nova.compute.manager [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.528 186962 DEBUG oslo_concurrency.lockutils [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.528 186962 DEBUG oslo_concurrency.lockutils [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.528 186962 DEBUG oslo_concurrency.lockutils [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.528 186962 DEBUG nova.compute.manager [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:01:00 np0005539505 nova_compute[186958]: 2025-11-29 07:01:00.529 186962 WARNING nova.compute.manager [req-42643c7f-1daf-4ac9-a09e-05874000ea06 req-c761794f-ce83-4926-8a14-6c31a8e4a352 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:01:00 np0005539505 systemd[1]: Started libpod-conmon-8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda.scope.
Nov 29 02:01:00 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:01:00 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5450dfa4ad9e8f43aa0a93cc8ce577eb1baecfca75dfda4cb3225eab79e11f85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:01:01 np0005539505 podman[222925]: 2025-11-29 07:01:01.21899911 +0000 UTC m=+2.656562305 container init 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:01:01 np0005539505 podman[222925]: 2025-11-29 07:01:01.224986869 +0000 UTC m=+2.662550024 container start 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:01:01 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [NOTICE]   (222958) : New worker (222960) forked
Nov 29 02:01:01 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [NOTICE]   (222958) : Loading success.
Nov 29 02:01:01 np0005539505 nova_compute[186958]: 2025-11-29 07:01:01.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:01 np0005539505 nova_compute[186958]: 2025-11-29 07:01:01.875 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Successfully updated port: 6926042d-b0e8-404f-86b4-5a7e05c6b07f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:01:01 np0005539505 nova_compute[186958]: 2025-11-29 07:01:01.954 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:01 np0005539505 nova_compute[186958]: 2025-11-29 07:01:01.955 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquired lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:01 np0005539505 nova_compute[186958]: 2025-11-29 07:01:01.955 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:01:02 np0005539505 nova_compute[186958]: 2025-11-29 07:01:02.054 186962 DEBUG nova.compute.manager [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-changed-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:02 np0005539505 nova_compute[186958]: 2025-11-29 07:01:02.055 186962 DEBUG nova.compute.manager [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Refreshing instance network info cache due to event network-changed-6926042d-b0e8-404f-86b4-5a7e05c6b07f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:01:02 np0005539505 nova_compute[186958]: 2025-11-29 07:01:02.055 186962 DEBUG oslo_concurrency.lockutils [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:02 np0005539505 nova_compute[186958]: 2025-11-29 07:01:02.738 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:01:03 np0005539505 nova_compute[186958]: 2025-11-29 07:01:03.674 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:03 np0005539505 podman[222982]: 2025-11-29 07:01:03.73259738 +0000 UTC m=+0.056094467 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:01:03 np0005539505 podman[222981]: 2025-11-29 07:01:03.739690961 +0000 UTC m=+0.067910822 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public)
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.058 186962 DEBUG nova.network.neutron [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Updating instance_info_cache with network_info: [{"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.096 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Releasing lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.097 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Instance network_info: |[{"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.098 186962 DEBUG oslo_concurrency.lockutils [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.099 186962 DEBUG nova.network.neutron [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Refreshing network info cache for port 6926042d-b0e8-404f-86b4-5a7e05c6b07f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.105 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Start _get_guest_xml network_info=[{"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.112 186962 WARNING nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.121 186962 DEBUG nova.virt.libvirt.host [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.123 186962 DEBUG nova.virt.libvirt.host [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.134 186962 DEBUG nova.virt.libvirt.host [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.136 186962 DEBUG nova.virt.libvirt.host [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.137 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.138 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.139 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.139 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.139 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.140 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.140 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.141 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.141 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.141 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.142 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.142 186962 DEBUG nova.virt.hardware [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.147 186962 DEBUG nova.virt.libvirt.vif [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-287502341',display_name='tempest-ServerMetadataTestJSON-server-287502341',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-287502341',id=55,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9a8c0812295749a6b8b144654ceafd3c',ramdisk_id='',reservation_id='r-hfpp2jag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-394669812',owner_user_name='tempest-ServerMetadataTestJ
SON-394669812-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:58Z,user_data=None,user_id='ed0cd4810c054eaa9d04fe9ef74bf011',uuid=af2973a1-4571-44fc-8f98-148372c33b8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.148 186962 DEBUG nova.network.os_vif_util [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converting VIF {"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.149 186962 DEBUG nova.network.os_vif_util [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.150 186962 DEBUG nova.objects.instance [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lazy-loading 'pci_devices' on Instance uuid af2973a1-4571-44fc-8f98-148372c33b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.168 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <uuid>af2973a1-4571-44fc-8f98-148372c33b8a</uuid>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <name>instance-00000037</name>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerMetadataTestJSON-server-287502341</nova:name>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:01:05</nova:creationTime>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:user uuid="ed0cd4810c054eaa9d04fe9ef74bf011">tempest-ServerMetadataTestJSON-394669812-project-member</nova:user>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:project uuid="9a8c0812295749a6b8b144654ceafd3c">tempest-ServerMetadataTestJSON-394669812</nova:project>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        <nova:port uuid="6926042d-b0e8-404f-86b4-5a7e05c6b07f">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="serial">af2973a1-4571-44fc-8f98-148372c33b8a</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="uuid">af2973a1-4571-44fc-8f98-148372c33b8a</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.config"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a4:e4:d3"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <target dev="tap6926042d-b0"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/console.log" append="off"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:01:05 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:01:05 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:01:05 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:01:05 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.175 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Preparing to wait for external event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.176 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.176 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.177 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.178 186962 DEBUG nova.virt.libvirt.vif [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:00:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-287502341',display_name='tempest-ServerMetadataTestJSON-server-287502341',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-287502341',id=55,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9a8c0812295749a6b8b144654ceafd3c',ramdisk_id='',reservation_id='r-hfpp2jag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataTestJSON-394669812',owner_user_name='tempest-ServerMet
adataTestJSON-394669812-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:00:58Z,user_data=None,user_id='ed0cd4810c054eaa9d04fe9ef74bf011',uuid=af2973a1-4571-44fc-8f98-148372c33b8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.178 186962 DEBUG nova.network.os_vif_util [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converting VIF {"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.179 186962 DEBUG nova.network.os_vif_util [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.180 186962 DEBUG os_vif [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.181 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.181 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.182 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.185 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.186 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6926042d-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.186 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6926042d-b0, col_values=(('external_ids', {'iface-id': '6926042d-b0e8-404f-86b4-5a7e05c6b07f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:e4:d3', 'vm-uuid': 'af2973a1-4571-44fc-8f98-148372c33b8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.188 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:05 np0005539505 NetworkManager[55134]: <info>  [1764399665.1896] manager: (tap6926042d-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.196 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.197 186962 INFO os_vif [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0')#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.841 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.842 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.842 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] No VIF found with MAC fa:16:3e:a4:e4:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:01:05 np0005539505 nova_compute[186958]: 2025-11-29 07:01:05.843 186962 INFO nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Using config drive#033[00m
Nov 29 02:01:06 np0005539505 nova_compute[186958]: 2025-11-29 07:01:06.759 186962 INFO nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Creating config drive at /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.config#033[00m
Nov 29 02:01:06 np0005539505 nova_compute[186958]: 2025-11-29 07:01:06.764 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9yem083 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:06 np0005539505 nova_compute[186958]: 2025-11-29 07:01:06.887 186962 DEBUG oslo_concurrency.processutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr9yem083" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:06 np0005539505 kernel: tap6926042d-b0: entered promiscuous mode
Nov 29 02:01:06 np0005539505 NetworkManager[55134]: <info>  [1764399666.9505] manager: (tap6926042d-b0): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Nov 29 02:01:06 np0005539505 nova_compute[186958]: 2025-11-29 07:01:06.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:06Z|00220|binding|INFO|Claiming lport 6926042d-b0e8-404f-86b4-5a7e05c6b07f for this chassis.
Nov 29 02:01:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:06Z|00221|binding|INFO|6926042d-b0e8-404f-86b4-5a7e05c6b07f: Claiming fa:16:3e:a4:e4:d3 10.100.0.3
Nov 29 02:01:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:06.987 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:e4:d3 10.100.0.3'], port_security=['fa:16:3e:a4:e4:d3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'af2973a1-4571-44fc-8f98-148372c33b8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9a8c0812295749a6b8b144654ceafd3c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '880663a7-b218-4097-b37f-b0875e9dc214', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3eedc4-6bbb-4a08-8fdb-c92c1707da3f, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6926042d-b0e8-404f-86b4-5a7e05c6b07f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:06.989 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6926042d-b0e8-404f-86b4-5a7e05c6b07f in datapath 284fab49-a685-4bfe-8b0e-6c2e3d7fe794 bound to our chassis#033[00m
Nov 29 02:01:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:06.991 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 284fab49-a685-4bfe-8b0e-6c2e3d7fe794#033[00m
Nov 29 02:01:07 np0005539505 systemd-machined[153285]: New machine qemu-27-instance-00000037.
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.001 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[163d2c46-fb49-488e-aaa4-d0fe9a1f1d67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.002 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap284fab49-a1 in ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.004 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap284fab49-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b97ca2e-cafa-46f7-b488-1b9432cda11d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.005 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83e92542-a69c-4b26-81ac-9f12e632dc71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.015 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[aef90deb-b928-4804-ba67-a99da174ddd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 systemd[1]: Started Virtual Machine qemu-27-instance-00000037.
Nov 29 02:01:07 np0005539505 systemd-udevd[223049]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:07Z|00222|binding|INFO|Setting lport 6926042d-b0e8-404f-86b4-5a7e05c6b07f ovn-installed in OVS
Nov 29 02:01:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:07Z|00223|binding|INFO|Setting lport 6926042d-b0e8-404f-86b4-5a7e05c6b07f up in Southbound
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.055 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4563069e-dace-41a5-9187-3c8b0c153522]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:07 np0005539505 NetworkManager[55134]: <info>  [1764399667.0654] device (tap6926042d-b0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:01:07 np0005539505 NetworkManager[55134]: <info>  [1764399667.0664] device (tap6926042d-b0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.091 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[02c1dbcf-9409-4706-9bf7-171fd2095456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 systemd-udevd[223051]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.096 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[180018a9-2868-4207-a3e5-a51f9ce66de4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 NetworkManager[55134]: <info>  [1764399667.0976] manager: (tap284fab49-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/117)
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.147 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[86fee2bd-836e-4f53-90ce-aa11612abcef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.151 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[35a4ec3c-02f0-401e-93f0-647f91f2f109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 NetworkManager[55134]: <info>  [1764399667.1869] device (tap284fab49-a0): carrier: link connected
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.192 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[01698c74-78d6-485a-9823-33e921208f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.210 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[415a037f-1a6d-4f7b-9bdf-a55cc54f03e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap284fab49-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:d8:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511477, 'reachable_time': 16255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223079, 'error': None, 'target': 'ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.236 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[67502409-3bf3-4f75-ae60-b471295f37b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:d858'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 511477, 'tstamp': 511477}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223080, 'error': None, 'target': 'ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.264 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ed977a2f-efb6-4c2f-a19a-e305ced5a0bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap284fab49-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:d8:58'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511477, 'reachable_time': 16255, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223081, 'error': None, 'target': 'ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.299 186962 DEBUG nova.network.neutron [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Updated VIF entry in instance network info cache for port 6926042d-b0e8-404f-86b4-5a7e05c6b07f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.301 186962 DEBUG nova.network.neutron [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Updating instance_info_cache with network_info: [{"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.315 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f9beaa85-a0dd-4b56-be0b-557d3202f4c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.328 186962 DEBUG oslo_concurrency.lockutils [req-9ba02415-81e0-4bef-8833-148ee346b911 req-c7a7c3d2-ef35-4cd5-9efb-bac94ffd256b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-af2973a1-4571-44fc-8f98-148372c33b8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.402 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[af9af4ec-cf6f-4696-9cbd-c4a708afd611]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.405 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap284fab49-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.406 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.407 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap284fab49-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:07 np0005539505 NetworkManager[55134]: <info>  [1764399667.4107] manager: (tap284fab49-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Nov 29 02:01:07 np0005539505 kernel: tap284fab49-a0: entered promiscuous mode
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.414 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap284fab49-a0, col_values=(('external_ids', {'iface-id': '56193f61-fb9f-4c89-b30b-1af79c33b873'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.418 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/284fab49-a685-4bfe-8b0e-6c2e3d7fe794.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/284fab49-a685-4bfe-8b0e-6c2e3d7fe794.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.418 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:07Z|00224|binding|INFO|Releasing lport 56193f61-fb9f-4c89-b30b-1af79c33b873 from this chassis (sb_readonly=0)
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.420 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ee2bc2a4-a738-4649-a183-9ee2f26aa73d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.422 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-284fab49-a685-4bfe-8b0e-6c2e3d7fe794
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/284fab49-a685-4bfe-8b0e-6c2e3d7fe794.pid.haproxy
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 284fab49-a685-4bfe-8b0e-6c2e3d7fe794
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:01:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:07.424 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'env', 'PROCESS_TAG=haproxy-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/284fab49-a685-4bfe-8b0e-6c2e3d7fe794.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.432 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.472 186962 DEBUG nova.compute.manager [req-0a6399c7-b667-4ab8-8d67-ba0fa625f04b req-9a26d5d9-023b-4e9e-bfc8-bc8434bd92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.473 186962 DEBUG oslo_concurrency.lockutils [req-0a6399c7-b667-4ab8-8d67-ba0fa625f04b req-9a26d5d9-023b-4e9e-bfc8-bc8434bd92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.473 186962 DEBUG oslo_concurrency.lockutils [req-0a6399c7-b667-4ab8-8d67-ba0fa625f04b req-9a26d5d9-023b-4e9e-bfc8-bc8434bd92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.474 186962 DEBUG oslo_concurrency.lockutils [req-0a6399c7-b667-4ab8-8d67-ba0fa625f04b req-9a26d5d9-023b-4e9e-bfc8-bc8434bd92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.474 186962 DEBUG nova.compute.manager [req-0a6399c7-b667-4ab8-8d67-ba0fa625f04b req-9a26d5d9-023b-4e9e-bfc8-bc8434bd92d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Processing event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.711 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.714 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399667.7096405, af2973a1-4571-44fc-8f98-148372c33b8a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.715 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] VM Started (Lifecycle Event)#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.725 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.730 186962 INFO nova.virt.libvirt.driver [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Instance spawned successfully.#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.731 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.746 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.753 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.783 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.785 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.787 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.788 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.789 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.789 186962 DEBUG nova.virt.libvirt.driver [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.795 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.796 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399667.7125998, af2973a1-4571-44fc-8f98-148372c33b8a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:07 np0005539505 nova_compute[186958]: 2025-11-29 07:01:07.796 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:01:07 np0005539505 podman[223118]: 2025-11-29 07:01:07.807956511 +0000 UTC m=+0.027960132 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.064 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.072 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399667.7176285, af2973a1-4571-44fc-8f98-148372c33b8a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.072 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.230 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.236 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.412 186962 INFO nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Took 10.26 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.413 186962 DEBUG nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.442 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.528 186962 INFO nova.compute.manager [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Took 11.05 seconds to build instance.#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.559 186962 DEBUG oslo_concurrency.lockutils [None req-363a0062-b26a-4ed0-820a-5258567eea13 ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:08 np0005539505 nova_compute[186958]: 2025-11-29 07:01:08.676 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:09 np0005539505 podman[223131]: 2025-11-29 07:01:09.20948931 +0000 UTC m=+0.520943465 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.227 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.228 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.273 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.395 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.396 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.402 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.404 186962 INFO nova.compute.claims [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:01:09 np0005539505 podman[223118]: 2025-11-29 07:01:09.472432697 +0000 UTC m=+1.692436318 container create c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.597 186962 DEBUG nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.599 186962 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.599 186962 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.600 186962 DEBUG oslo_concurrency.lockutils [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.601 186962 DEBUG nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] No waiting events found dispatching network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.602 186962 WARNING nova.compute.manager [req-2f30626f-d560-4350-a447-20903b419d48 req-215346a2-1f91-4d36-83de-c02c1ffc5080 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received unexpected event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f for instance with vm_state active and task_state None.#033[00m
Nov 29 02:01:09 np0005539505 nova_compute[186958]: 2025-11-29 07:01:09.670 186962 DEBUG nova.compute.provider_tree [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:09 np0005539505 systemd[1]: Started libpod-conmon-c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a.scope.
Nov 29 02:01:09 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:01:09 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b1b249394800c68637c60a7b80c891e2a56f7912fc38d011851bb4b175d8436/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:01:10 np0005539505 nova_compute[186958]: 2025-11-29 07:01:10.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:10 np0005539505 podman[223118]: 2025-11-29 07:01:10.492449895 +0000 UTC m=+2.712453516 container init c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 02:01:10 np0005539505 podman[223118]: 2025-11-29 07:01:10.505042681 +0000 UTC m=+2.725046272 container start c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:01:10 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [NOTICE]   (223156) : New worker (223158) forked
Nov 29 02:01:10 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [NOTICE]   (223156) : Loading success.
Nov 29 02:01:10 np0005539505 nova_compute[186958]: 2025-11-29 07:01:10.694 186962 DEBUG nova.scheduler.client.report [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:11 np0005539505 nova_compute[186958]: 2025-11-29 07:01:11.604 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:11 np0005539505 nova_compute[186958]: 2025-11-29 07:01:11.605 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:01:11 np0005539505 nova_compute[186958]: 2025-11-29 07:01:11.919 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:01:11 np0005539505 nova_compute[186958]: 2025-11-29 07:01:11.920 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:01:11 np0005539505 nova_compute[186958]: 2025-11-29 07:01:11.978 186962 INFO nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.048 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.396 186962 DEBUG nova.policy [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.554 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.556 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.557 186962 INFO nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Creating image(s)#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.559 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.559 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.561 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.587 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.691 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.694 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.695 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.720 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.790 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:12 np0005539505 nova_compute[186958]: 2025-11-29 07:01:12.792 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:13 np0005539505 nova_compute[186958]: 2025-11-29 07:01:13.321 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Successfully created port: 26c27b84-e5c3-4a6f-8631-6517e783ca9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:01:13 np0005539505 nova_compute[186958]: 2025-11-29 07:01:13.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:14 np0005539505 nova_compute[186958]: 2025-11-29 07:01:14.492 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Successfully updated port: 26c27b84-e5c3-4a6f-8631-6517e783ca9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:01:14 np0005539505 podman[223178]: 2025-11-29 07:01:14.753631831 +0000 UTC m=+0.077851853 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:01:14 np0005539505 podman[223179]: 2025-11-29 07:01:14.785507392 +0000 UTC m=+0.109995042 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.038 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.039 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquired lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.039 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.191 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.600 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.879 186962 DEBUG nova.compute.manager [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Received event network-changed-26c27b84-e5c3-4a6f-8631-6517e783ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.880 186962 DEBUG nova.compute.manager [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Refreshing instance network info cache due to event network-changed-26c27b84-e5c3-4a6f-8631-6517e783ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:01:15 np0005539505 nova_compute[186958]: 2025-11-29 07:01:15.881 186962 DEBUG oslo_concurrency.lockutils [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.789 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk 1073741824" returned: 0 in 3.997s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.791 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 4.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.791 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.863 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.864 186962 DEBUG nova.virt.disk.api [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.865 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.919 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.921 186962 DEBUG nova.virt.disk.api [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.921 186962 DEBUG nova.objects.instance [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid b681523d-c882-4406-a91b-5cae6d761201 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.942 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.943 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Ensure instance console log exists: /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.944 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.945 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:16 np0005539505 nova_compute[186958]: 2025-11-29 07:01:16.945 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.370 186962 DEBUG nova.network.neutron [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updating instance_info_cache with network_info: [{"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.426 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Releasing lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.427 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Instance network_info: |[{"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.427 186962 DEBUG oslo_concurrency.lockutils [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.428 186962 DEBUG nova.network.neutron [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Refreshing network info cache for port 26c27b84-e5c3-4a6f-8631-6517e783ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.431 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Start _get_guest_xml network_info=[{"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.439 186962 WARNING nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.445 186962 DEBUG nova.virt.libvirt.host [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.446 186962 DEBUG nova.virt.libvirt.host [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.450 186962 DEBUG nova.virt.libvirt.host [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.451 186962 DEBUG nova.virt.libvirt.host [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.453 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.453 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.454 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.454 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.455 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.455 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.456 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.456 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.457 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.457 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.457 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.458 186962 DEBUG nova.virt.hardware [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.463 186962 DEBUG nova.virt.libvirt.vif [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-647782358',display_name='tempest-ServersAdminTestJSON-server-647782358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-647782358',id=56,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-9ic89zym',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:12Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=b681523d-c882-4406-a91b-5cae6d761201,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.464 186962 DEBUG nova.network.os_vif_util [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.465 186962 DEBUG nova.network.os_vif_util [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.467 186962 DEBUG nova.objects.instance [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid b681523d-c882-4406-a91b-5cae6d761201 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.489 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <uuid>b681523d-c882-4406-a91b-5cae6d761201</uuid>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <name>instance-00000038</name>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersAdminTestJSON-server-647782358</nova:name>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:01:18</nova:creationTime>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        <nova:port uuid="26c27b84-e5c3-4a6f-8631-6517e783ca9b">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="serial">b681523d-c882-4406-a91b-5cae6d761201</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="uuid">b681523d-c882-4406-a91b-5cae6d761201</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.config"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:ba:70:30"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <target dev="tap26c27b84-e5"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/console.log" append="off"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:01:18 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:01:18 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:01:18 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:01:18 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.497 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Preparing to wait for external event network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.498 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.498 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.498 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.499 186962 DEBUG nova.virt.libvirt.vif [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-647782358',display_name='tempest-ServersAdminTestJSON-server-647782358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-647782358',id=56,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-9ic89zym',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:12Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=b681523d-c882-4406-a91b-5cae6d761201,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.499 186962 DEBUG nova.network.os_vif_util [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.500 186962 DEBUG nova.network.os_vif_util [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.500 186962 DEBUG os_vif [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.502 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.502 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.503 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.509 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.509 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26c27b84-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.511 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26c27b84-e5, col_values=(('external_ids', {'iface-id': '26c27b84-e5c3-4a6f-8631-6517e783ca9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:70:30', 'vm-uuid': 'b681523d-c882-4406-a91b-5cae6d761201'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.517 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:01:18 np0005539505 NetworkManager[55134]: <info>  [1764399678.5189] manager: (tap26c27b84-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.527 186962 INFO os_vif [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5')#033[00m
Nov 29 02:01:18 np0005539505 nova_compute[186958]: 2025-11-29 07:01:18.681 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.082 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.084 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.085 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.085 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.085 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.092 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.093 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.093 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:ba:70:30, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.093 186962 INFO nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Using config drive#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.109 186962 INFO nova.compute.manager [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Terminating instance#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.129 186962 DEBUG nova.compute.manager [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.738 186962 INFO nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Creating config drive at /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.config#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.744 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwtmectdy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:20 np0005539505 podman[223247]: 2025-11-29 07:01:20.757616189 +0000 UTC m=+0.075307361 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.773 186962 DEBUG nova.network.neutron [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updated VIF entry in instance network info cache for port 26c27b84-e5c3-4a6f-8631-6517e783ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.774 186962 DEBUG nova.network.neutron [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updating instance_info_cache with network_info: [{"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.869 186962 DEBUG oslo_concurrency.processutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwtmectdy" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:20 np0005539505 kernel: tap26c27b84-e5: entered promiscuous mode
Nov 29 02:01:20 np0005539505 NetworkManager[55134]: <info>  [1764399680.9323] manager: (tap26c27b84-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Nov 29 02:01:20 np0005539505 systemd-udevd[223282]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.987 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:20 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:20Z|00225|binding|INFO|Claiming lport 26c27b84-e5c3-4a6f-8631-6517e783ca9b for this chassis.
Nov 29 02:01:20 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:20Z|00226|binding|INFO|26c27b84-e5c3-4a6f-8631-6517e783ca9b: Claiming fa:16:3e:ba:70:30 10.100.0.11
Nov 29 02:01:20 np0005539505 nova_compute[186958]: 2025-11-29 07:01:20.998 186962 DEBUG oslo_concurrency.lockutils [req-ad741269-78d7-4f73-9142-19e9491e5088 req-a6eb2032-d75b-4575-bff9-e63a9150a544 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:01:21 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:21Z|00227|binding|INFO|Setting lport 26c27b84-e5c3-4a6f-8631-6517e783ca9b ovn-installed in OVS
Nov 29 02:01:21 np0005539505 NetworkManager[55134]: <info>  [1764399681.0088] device (tap26c27b84-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:01:21 np0005539505 NetworkManager[55134]: <info>  [1764399681.0102] device (tap26c27b84-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.011 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:21 np0005539505 systemd-machined[153285]: New machine qemu-28-instance-00000038.
Nov 29 02:01:21 np0005539505 systemd[1]: Started Virtual Machine qemu-28-instance-00000038.
Nov 29 02:01:21 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:21Z|00228|binding|INFO|Setting lport 26c27b84-e5c3-4a6f-8631-6517e783ca9b up in Southbound
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.167 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:70:30 10.100.0.11'], port_security=['fa:16:3e:ba:70:30 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '2', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=26c27b84-e5c3-4a6f-8631-6517e783ca9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.170 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 26c27b84-e5c3-4a6f-8631-6517e783ca9b in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.172 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.199 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2afbcfa-92f1-46cc-9f49-fe2a9307fc60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.240 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dc0a6f10-50e1-4113-8a30-c22aeff61b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.246 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bd509442-1602-4e52-8285-5bb52d8fe140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.286 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c9da3b63-4e46-4541-8d28-d01e601cada7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.305 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c87cc2a-6c18-4ae5-accd-a42a38e65701]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 25382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223299, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.335 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27473670-cf27-4a77-8bb4-ef1a6ec6286d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223300, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223300, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.340 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.344 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.344 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.345 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:21.345 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.562 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399681.5620046, b681523d-c882-4406-a91b-5cae6d761201 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.563 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] VM Started (Lifecycle Event)#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.587 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.594 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399681.5622969, b681523d-c882-4406-a91b-5cae6d761201 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.594 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.613 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.618 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:01:21 np0005539505 nova_compute[186958]: 2025-11-29 07:01:21.639 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.225 186962 DEBUG nova.compute.manager [req-f456e19a-d8bf-435a-a9dc-4ec088821fb9 req-9b5f7c71-6245-4efa-a386-39fec38d2000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Received event network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.226 186962 DEBUG oslo_concurrency.lockutils [req-f456e19a-d8bf-435a-a9dc-4ec088821fb9 req-9b5f7c71-6245-4efa-a386-39fec38d2000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.226 186962 DEBUG oslo_concurrency.lockutils [req-f456e19a-d8bf-435a-a9dc-4ec088821fb9 req-9b5f7c71-6245-4efa-a386-39fec38d2000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.226 186962 DEBUG oslo_concurrency.lockutils [req-f456e19a-d8bf-435a-a9dc-4ec088821fb9 req-9b5f7c71-6245-4efa-a386-39fec38d2000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.227 186962 DEBUG nova.compute.manager [req-f456e19a-d8bf-435a-a9dc-4ec088821fb9 req-9b5f7c71-6245-4efa-a386-39fec38d2000 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Processing event network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.227 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.231 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399683.2312717, b681523d-c882-4406-a91b-5cae6d761201 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.232 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.237 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.246 186962 INFO nova.virt.libvirt.driver [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] Instance spawned successfully.#033[00m
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.246 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:01:23 np0005539505 kernel: tap6926042d-b0 (unregistering): left promiscuous mode
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.255 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:23 np0005539505 NetworkManager[55134]: <info>  [1764399683.2567] device (tap6926042d-b0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.262 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:01:23 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:23Z|00229|binding|INFO|Releasing lport 6926042d-b0e8-404f-86b4-5a7e05c6b07f from this chassis (sb_readonly=0)
Nov 29 02:01:23 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:23Z|00230|binding|INFO|Setting lport 6926042d-b0e8-404f-86b4-5a7e05c6b07f down in Southbound
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.272 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:23Z|00231|binding|INFO|Removing iface tap6926042d-b0 ovn-installed in OVS
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.277 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.301 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000037.scope: Deactivated successfully.
Nov 29 02:01:23 np0005539505 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000037.scope: Consumed 11.988s CPU time.
Nov 29 02:01:23 np0005539505 systemd-machined[153285]: Machine qemu-27-instance-00000037 terminated.
Nov 29 02:01:23 np0005539505 podman[223312]: 2025-11-29 07:01:23.359458925 +0000 UTC m=+0.076957358 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.376 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.381 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.392 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.392 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.392 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.393 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.393 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.393 186962 DEBUG nova.virt.libvirt.driver [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.423 186962 INFO nova.virt.libvirt.driver [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Instance destroyed successfully.
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.424 186962 DEBUG nova.objects.instance [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lazy-loading 'resources' on Instance uuid af2973a1-4571-44fc-8f98-148372c33b8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:23.630 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:e4:d3 10.100.0.3'], port_security=['fa:16:3e:a4:e4:d3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'af2973a1-4571-44fc-8f98-148372c33b8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9a8c0812295749a6b8b144654ceafd3c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '880663a7-b218-4097-b37f-b0875e9dc214', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa3eedc4-6bbb-4a08-8fdb-c92c1707da3f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=6926042d-b0e8-404f-86b4-5a7e05c6b07f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:01:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:23.633 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 6926042d-b0e8-404f-86b4-5a7e05c6b07f in datapath 284fab49-a685-4bfe-8b0e-6c2e3d7fe794 unbound from our chassis
Nov 29 02:01:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:23.638 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 284fab49-a685-4bfe-8b0e-6c2e3d7fe794, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:01:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:23.640 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a60f1cb4-f272-41e9-bb2d-8c3c51b7e208]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:01:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:23.641 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794 namespace which is not needed anymore
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.669 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.672 186962 DEBUG nova.virt.libvirt.vif [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataTestJSON-server-287502341',display_name='tempest-ServerMetadataTestJSON-server-287502341',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatatestjson-server-287502341',id=55,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={key1='alt1',key2='value2',key3='value3'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9a8c0812295749a6b8b144654ceafd3c',ramdisk_id='',reservation_id='r-hfpp2jag',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataTestJSON-394669812',owner_user_name='tempest-ServerMetadataTestJSON-394669812-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:18Z,user_data=None,user_id='ed0cd4810c054eaa9d04fe9ef74bf011',uuid=af2973a1-4571-44fc-8f98-148372c33b8a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.672 186962 DEBUG nova.network.os_vif_util [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converting VIF {"id": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "address": "fa:16:3e:a4:e4:d3", "network": {"id": "284fab49-a685-4bfe-8b0e-6c2e3d7fe794", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1099851490-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9a8c0812295749a6b8b144654ceafd3c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6926042d-b0", "ovs_interfaceid": "6926042d-b0e8-404f-86b4-5a7e05c6b07f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.673 186962 DEBUG nova.network.os_vif_util [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.674 186962 DEBUG os_vif [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.675 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.676 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6926042d-b0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.680 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.686 186962 INFO os_vif [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:e4:d3,bridge_name='br-int',has_traffic_filtering=True,id=6926042d-b0e8-404f-86b4-5a7e05c6b07f,network=Network(284fab49-a685-4bfe-8b0e-6c2e3d7fe794),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6926042d-b0')
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.686 186962 INFO nova.virt.libvirt.driver [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Deleting instance files /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a_del
Nov 29 02:01:23 np0005539505 nova_compute[186958]: 2025-11-29 07:01:23.688 186962 INFO nova.virt.libvirt.driver [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Deletion of /var/lib/nova/instances/af2973a1-4571-44fc-8f98-148372c33b8a_del complete
Nov 29 02:01:24 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [NOTICE]   (223156) : haproxy version is 2.8.14-c23fe91
Nov 29 02:01:24 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [NOTICE]   (223156) : path to executable is /usr/sbin/haproxy
Nov 29 02:01:24 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [WARNING]  (223156) : Exiting Master process...
Nov 29 02:01:24 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [ALERT]    (223156) : Current worker (223158) exited with code 143 (Terminated)
Nov 29 02:01:24 np0005539505 neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794[223152]: [WARNING]  (223156) : All workers exited. Exiting... (0)
Nov 29 02:01:24 np0005539505 systemd[1]: libpod-c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a.scope: Deactivated successfully.
Nov 29 02:01:24 np0005539505 podman[223369]: 2025-11-29 07:01:24.086042535 +0000 UTC m=+0.343515037 container died c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.528 186962 INFO nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Took 11.97 seconds to spawn the instance on the hypervisor.
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.530 186962 DEBUG nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.540 186962 INFO nova.compute.manager [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Took 4.41 seconds to destroy the instance on the hypervisor.
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.541 186962 DEBUG oslo.service.loopingcall [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.541 186962 DEBUG nova.compute.manager [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:01:24 np0005539505 nova_compute[186958]: 2025-11-29 07:01:24.542 186962 DEBUG nova.network.neutron [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.038 186962 INFO nova.compute.manager [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Took 15.69 seconds to build instance.
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.227 186962 DEBUG oslo_concurrency.lockutils [None req-115c8a41-834a-4102-9f80-e979418d4ab0 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.999s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.438 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Received event network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.439 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] No waiting events found dispatching network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 WARNING nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Received unexpected event network-vif-plugged-26c27b84-e5c3-4a6f-8631-6517e783ca9b for instance with vm_state active and task_state None.
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-unplugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.440 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] No waiting events found dispatching network-vif-unplugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-unplugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.441 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.442 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.442 186962 DEBUG oslo_concurrency.lockutils [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.442 186962 DEBUG nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] No waiting events found dispatching network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.442 186962 WARNING nova.compute.manager [req-fd0450ec-4ab7-44ae-a460-7191635ff67e req-f7a4a587-24b0-44f8-b1f0-3d9afa82d16d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received unexpected event network-vif-plugged-6926042d-b0e8-404f-86b4-5a7e05c6b07f for instance with vm_state active and task_state deleting.
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.939 186962 DEBUG nova.network.neutron [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:01:25 np0005539505 nova_compute[186958]: 2025-11-29 07:01:25.972 186962 INFO nova.compute.manager [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Took 1.43 seconds to deallocate network for instance.#033[00m
Nov 29 02:01:26 np0005539505 systemd[1]: var-lib-containers-storage-overlay-2b1b249394800c68637c60a7b80c891e2a56f7912fc38d011851bb4b175d8436-merged.mount: Deactivated successfully.
Nov 29 02:01:26 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a-userdata-shm.mount: Deactivated successfully.
Nov 29 02:01:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:26Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:01:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:26Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:01:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:26.938 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:26.939 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:26.940 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:27 np0005539505 nova_compute[186958]: 2025-11-29 07:01:27.045 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:27 np0005539505 nova_compute[186958]: 2025-11-29 07:01:27.047 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:27 np0005539505 nova_compute[186958]: 2025-11-29 07:01:27.393 186962 DEBUG nova.compute.manager [req-f0ca7f4f-501f-484e-8dc5-1fbcde1fea25 req-09d632d5-864f-4d6d-b225-c9f1f3e9c85d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Received event network-vif-deleted-6926042d-b0e8-404f-86b4-5a7e05c6b07f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:01:27 np0005539505 nova_compute[186958]: 2025-11-29 07:01:27.508 186962 DEBUG nova.compute.provider_tree [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:27 np0005539505 nova_compute[186958]: 2025-11-29 07:01:27.582 186962 DEBUG nova.scheduler.client.report [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:27 np0005539505 podman[223369]: 2025-11-29 07:01:27.789409334 +0000 UTC m=+4.046881836 container cleanup c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:01:27 np0005539505 systemd[1]: libpod-conmon-c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a.scope: Deactivated successfully.
Nov 29 02:01:28 np0005539505 nova_compute[186958]: 2025-11-29 07:01:28.233 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:28 np0005539505 nova_compute[186958]: 2025-11-29 07:01:28.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:28 np0005539505 nova_compute[186958]: 2025-11-29 07:01:28.685 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:28 np0005539505 nova_compute[186958]: 2025-11-29 07:01:28.865 186962 INFO nova.scheduler.client.report [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Deleted allocations for instance af2973a1-4571-44fc-8f98-148372c33b8a#033[00m
Nov 29 02:01:29 np0005539505 nova_compute[186958]: 2025-11-29 07:01:29.264 186962 DEBUG oslo_concurrency.lockutils [None req-7db0f184-6f55-4413-94aa-47df8c5cd91d ed0cd4810c054eaa9d04fe9ef74bf011 9a8c0812295749a6b8b144654ceafd3c - - default default] Lock "af2973a1-4571-44fc-8f98-148372c33b8a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:32 np0005539505 podman[223401]: 2025-11-29 07:01:32.16511204 +0000 UTC m=+4.331688052 container remove c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.174 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b05b1ab-f745-442c-a46b-c73bcd07d4e4]: (4, ('Sat Nov 29 07:01:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794 (c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a)\nc770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a\nSat Nov 29 07:01:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794 (c770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a)\nc770d500ba38a80534b6b6dfaf106463d6daef4c9738b50ecdead4534fd2132a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.177 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3f25b05e-bcbc-4376-900b-bd88a166e3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.179 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap284fab49-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:32 np0005539505 nova_compute[186958]: 2025-11-29 07:01:32.182 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:32 np0005539505 kernel: tap284fab49-a0: left promiscuous mode
Nov 29 02:01:32 np0005539505 nova_compute[186958]: 2025-11-29 07:01:32.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.193 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e70fbe7c-48a2-4731-b74d-cb5fa10db814]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 nova_compute[186958]: 2025-11-29 07:01:32.218 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.398 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5ee56d-e17b-468c-aaff-adc1e3c47890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.400 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[41565a1c-3b9d-47e9-bb0a-088dec49e87a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.423 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[13baf104-3eb4-4362-8c0e-9c9c824eb0bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 511466, 'reachable_time': 19458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223415, 'error': None, 'target': 'ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:32 np0005539505 systemd[1]: run-netns-ovnmeta\x2d284fab49\x2da685\x2d4bfe\x2d8b0e\x2d6c2e3d7fe794.mount: Deactivated successfully.
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.433 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-284fab49-a685-4bfe-8b0e-6c2e3d7fe794 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:01:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:32.435 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbeef88-1155-442e-9921-cb0d04676a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:01:33 np0005539505 nova_compute[186958]: 2025-11-29 07:01:33.681 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:33 np0005539505 nova_compute[186958]: 2025-11-29 07:01:33.687 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:34 np0005539505 podman[223421]: 2025-11-29 07:01:34.736824995 +0000 UTC m=+0.056869340 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:01:34 np0005539505 podman[223420]: 2025-11-29 07:01:34.7720114 +0000 UTC m=+0.093309490 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:01:36 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:36Z|00232|binding|INFO|Releasing lport e6d6aadc-4cde-4c62-a881-70607e3666f6 from this chassis (sb_readonly=0)
Nov 29 02:01:36 np0005539505 nova_compute[186958]: 2025-11-29 07:01:36.256 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:38 np0005539505 nova_compute[186958]: 2025-11-29 07:01:38.671 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399683.4203944, af2973a1-4571-44fc-8f98-148372c33b8a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:01:38 np0005539505 nova_compute[186958]: 2025-11-29 07:01:38.671 186962 INFO nova.compute.manager [-] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:01:38 np0005539505 nova_compute[186958]: 2025-11-29 07:01:38.685 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:38 np0005539505 nova_compute[186958]: 2025-11-29 07:01:38.689 186962 DEBUG nova.compute.manager [None req-a24f4117-ae67-4e74-b050-9da44969f7cd - - - - - -] [instance: af2973a1-4571-44fc-8f98-148372c33b8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:38 np0005539505 nova_compute[186958]: 2025-11-29 07:01:38.689 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:39 np0005539505 podman[223480]: 2025-11-29 07:01:39.750885082 +0000 UTC m=+0.078717515 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 02:01:42 np0005539505 nova_compute[186958]: 2025-11-29 07:01:42.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:42 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:42Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ba:70:30 10.100.0.11
Nov 29 02:01:42 np0005539505 ovn_controller[95143]: 2025-11-29T07:01:42Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ba:70:30 10.100.0.11
Nov 29 02:01:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:43.519 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:01:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:43.519 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:01:43 np0005539505 nova_compute[186958]: 2025-11-29 07:01:43.520 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:43 np0005539505 nova_compute[186958]: 2025-11-29 07:01:43.686 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:43 np0005539505 nova_compute[186958]: 2025-11-29 07:01:43.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:45 np0005539505 podman[223500]: 2025-11-29 07:01:45.732124834 +0000 UTC m=+0.063153614 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:01:45 np0005539505 podman[223501]: 2025-11-29 07:01:45.770103972 +0000 UTC m=+0.098679062 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:01:47 np0005539505 nova_compute[186958]: 2025-11-29 07:01:47.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:47 np0005539505 nova_compute[186958]: 2025-11-29 07:01:47.456 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:48 np0005539505 nova_compute[186958]: 2025-11-29 07:01:48.408 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:48 np0005539505 nova_compute[186958]: 2025-11-29 07:01:48.688 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:48 np0005539505 nova_compute[186958]: 2025-11-29 07:01:48.693 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:50 np0005539505 nova_compute[186958]: 2025-11-29 07:01:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:50 np0005539505 nova_compute[186958]: 2025-11-29 07:01:50.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:01:50 np0005539505 nova_compute[186958]: 2025-11-29 07:01:50.400 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:01:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:01:50.522 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:01:51 np0005539505 podman[223553]: 2025-11-29 07:01:51.767559953 +0000 UTC m=+0.093786823 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:01:51 np0005539505 nova_compute[186958]: 2025-11-29 07:01:51.949 186962 INFO nova.compute.manager [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Rebuilding instance#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.244 186962 DEBUG nova.compute.manager [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.317 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_requests' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.329 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.344 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.401 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.419 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.423 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.423 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.423 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.423 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.434 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.438 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:01:52 np0005539505 nova_compute[186958]: 2025-11-29 07:01:52.506 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.129 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.623s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.131 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.191 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.200 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.256 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.257 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.332 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.526 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.528 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5394MB free_disk=73.16884231567383GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.528 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.528 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 230d36aa-b1ff-4e7d-a024-af0021cd0044 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance b681523d-c882-4406-a91b-5cae6d761201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.692 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.701 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.720 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:01:53 np0005539505 podman[223590]: 2025-11-29 07:01:53.730481079 +0000 UTC m=+0.059145739 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.748 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:01:53 np0005539505 nova_compute[186958]: 2025-11-29 07:01:53.748 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:01:54 np0005539505 nova_compute[186958]: 2025-11-29 07:01:54.721 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:57 np0005539505 nova_compute[186958]: 2025-11-29 07:01:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:57 np0005539505 nova_compute[186958]: 2025-11-29 07:01:57.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:57 np0005539505 nova_compute[186958]: 2025-11-29 07:01:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:01:58 np0005539505 nova_compute[186958]: 2025-11-29 07:01:58.697 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:01:58 np0005539505 nova_compute[186958]: 2025-11-29 07:01:58.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.594 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.595 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.595 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.918 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.918 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.918 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:01:59 np0005539505 nova_compute[186958]: 2025-11-29 07:01:59.919 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:02 np0005539505 nova_compute[186958]: 2025-11-29 07:02:02.485 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:02:03 np0005539505 nova_compute[186958]: 2025-11-29 07:02:03.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:02:05 np0005539505 podman[223613]: 2025-11-29 07:02:05.751492473 +0000 UTC m=+0.061899418 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:02:05 np0005539505 podman[223612]: 2025-11-29 07:02:05.756948878 +0000 UTC m=+0.081639368 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:02:05 np0005539505 nova_compute[186958]: 2025-11-29 07:02:05.930 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updating instance_info_cache with network_info: [{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:02:06 np0005539505 nova_compute[186958]: 2025-11-29 07:02:06.433 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-230d36aa-b1ff-4e7d-a024-af0021cd0044" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:02:06 np0005539505 nova_compute[186958]: 2025-11-29 07:02:06.434 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:02:06 np0005539505 nova_compute[186958]: 2025-11-29 07:02:06.435 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:06 np0005539505 nova_compute[186958]: 2025-11-29 07:02:06.436 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:02:07 np0005539505 nova_compute[186958]: 2025-11-29 07:02:07.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:08 np0005539505 nova_compute[186958]: 2025-11-29 07:02:08.703 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 kernel: tape5af6202-8a (unregistering): left promiscuous mode
Nov 29 02:02:09 np0005539505 NetworkManager[55134]: <info>  [1764399729.4873] device (tape5af6202-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:02:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:09Z|00233|binding|INFO|Releasing lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 from this chassis (sb_readonly=0)
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:09Z|00234|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 down in Southbound
Nov 29 02:02:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:09Z|00235|binding|INFO|Removing iface tape5af6202-8a ovn-installed in OVS
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.506 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.508 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.512 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.534 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[176280a3-f596-44dd-a750-16d114ebbf6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000035.scope: Deactivated successfully.
Nov 29 02:02:09 np0005539505 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000035.scope: Consumed 17.396s CPU time.
Nov 29 02:02:09 np0005539505 systemd-machined[153285]: Machine qemu-26-instance-00000035 terminated.
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.569 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7f3bd4-f8ff-4d5e-9792-13fa6bae014f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.574 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2f52cbf5-287e-40a8-a541-260481b7a793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.602 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[587dab9a-a4c8-4f69-a064-71df7c8d4a7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.617 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c5f7c8-c24c-42a2-950b-90794f6e58af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 39366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223665, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.631 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb62435-4d48-4aa9-b58e-324987b87920]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223666, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223666, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.633 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.639 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.640 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.640 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:09.640 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.774 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance shutdown successfully after 17 seconds.#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.780 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance destroyed successfully.#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.785 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance destroyed successfully.#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.786 186962 DEBUG nova.virt.libvirt.vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,
task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:01:51Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.786 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.787 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.788 186962 DEBUG os_vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.791 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5af6202-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.793 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.797 186962 INFO os_vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.797 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deleting instance files /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del#033[00m
Nov 29 02:02:09 np0005539505 nova_compute[186958]: 2025-11-29 07:02:09.798 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deletion of /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del complete#033[00m
Nov 29 02:02:09 np0005539505 podman[223683]: 2025-11-29 07:02:09.839443346 +0000 UTC m=+0.049137596 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.016 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.017 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating image(s)#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.018 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.018 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.019 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.090 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.165 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.166 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.166 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.176 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.236 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.237 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.841 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824" returned: 0 in 0.604s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.843 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.843 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.907 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.909 186962 DEBUG nova.virt.disk.api [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.909 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.967 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.968 186962 DEBUG nova.virt.disk.api [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.969 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.970 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Ensure instance console log exists: /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.970 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.971 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.971 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.975 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start _get_guest_xml network_info=[{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.979 186962 WARNING nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.987 186962 DEBUG nova.virt.libvirt.host [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.988 186962 DEBUG nova.virt.libvirt.host [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.992 186962 DEBUG nova.virt.libvirt.host [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.992 186962 DEBUG nova.virt.libvirt.host [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.994 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.994 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.995 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.995 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.995 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.996 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.996 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.996 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.997 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.997 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.997 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.998 186962 DEBUG nova.virt.hardware [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:02:10 np0005539505 nova_compute[186958]: 2025-11-29 07:02:10.998 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.020 186962 DEBUG nova.virt.libvirt.vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdmin
TestJSON-1087744064-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:02:09Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.021 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.022 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.024 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <uuid>230d36aa-b1ff-4e7d-a024-af0021cd0044</uuid>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <name>instance-00000035</name>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersAdminTestJSON-server-1580825723</nova:name>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:02:10</nova:creationTime>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        <nova:port uuid="e5af6202-8a71-48e2-ae69-2b3cb0d3a948">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="serial">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="uuid">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:ff:47:55"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <target dev="tape5af6202-8a"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log" append="off"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:02:11 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:02:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:02:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:02:11 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.024 186962 DEBUG nova.compute.manager [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Preparing to wait for external event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.024 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.025 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.025 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.026 186962 DEBUG nova.virt.libvirt.vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:00:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:02:09Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.026 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.026 186962 DEBUG nova.network.os_vif_util [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.027 186962 DEBUG os_vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.027 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.028 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.028 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.031 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5af6202-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.031 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5af6202-8a, col_values=(('external_ids', {'iface-id': 'e5af6202-8a71-48e2-ae69-2b3cb0d3a948', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:47:55', 'vm-uuid': '230d36aa-b1ff-4e7d-a024-af0021cd0044'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:11 np0005539505 NetworkManager[55134]: <info>  [1764399731.0342] manager: (tape5af6202-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.040 186962 INFO os_vif [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.241 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.241 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.242 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:ff:47:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.242 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Using config drive#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.258 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.282 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'keypairs' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.651 186962 INFO nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating config drive at /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.661 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq72bavel execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.792 186962 DEBUG oslo_concurrency.processutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq72bavel" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:11 np0005539505 kernel: tape5af6202-8a: entered promiscuous mode
Nov 29 02:02:11 np0005539505 NetworkManager[55134]: <info>  [1764399731.8691] manager: (tape5af6202-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/122)
Nov 29 02:02:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:11Z|00236|binding|INFO|Claiming lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for this chassis.
Nov 29 02:02:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:11Z|00237|binding|INFO|e5af6202-8a71-48e2-ae69-2b3cb0d3a948: Claiming fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 systemd-udevd[223657]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:02:11 np0005539505 NetworkManager[55134]: <info>  [1764399731.8835] device (tape5af6202-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:02:11 np0005539505 NetworkManager[55134]: <info>  [1764399731.8842] device (tape5af6202-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:02:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:11Z|00238|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 ovn-installed in OVS
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 nova_compute[186958]: 2025-11-29 07:02:11.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:11 np0005539505 systemd-machined[153285]: New machine qemu-29-instance-00000035.
Nov 29 02:02:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:11Z|00239|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 up in Southbound
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.917 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.919 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:02:11 np0005539505 systemd[1]: Started Virtual Machine qemu-29-instance-00000035.
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.920 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.933 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2558c902-9d5b-4f05-917b-13ae34ea34a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.961 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[48a31950-baa0-4814-b5dc-d6df93faafa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.964 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[43de027e-6917-4e07-b0d3-1a1a76934504]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:11.989 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb6e12a-e5cc-4765-8e66-ea04030e0dff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.007 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5328748f-f0bf-466e-9443-f5b2739be472]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 39366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223752, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.022 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9a831ebb-013c-46fe-ae35-a5150dc702c6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223753, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223753, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.024 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.025 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.027 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.027 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.027 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:12.028 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.051 186962 DEBUG nova.compute.manager [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.051 186962 DEBUG oslo_concurrency.lockutils [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.051 186962 DEBUG oslo_concurrency.lockutils [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.052 186962 DEBUG oslo_concurrency.lockutils [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.052 186962 DEBUG nova.compute.manager [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No event matching network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in dict_keys([('network-vif-plugged', 'e5af6202-8a71-48e2-ae69-2b3cb0d3a948')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.052 186962 WARNING nova.compute.manager [req-e55260e4-532c-4a0b-a90b-1fb8da5c3477 req-83f699ea-4008-465f-8ac7-0a54f5215508 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state error and task_state rebuild_spawning.#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.305 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 230d36aa-b1ff-4e7d-a024-af0021cd0044 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.306 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399732.3050752, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.306 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Started (Lifecycle Event)#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.333 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.339 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399732.3061934, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.340 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.369 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.375 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:02:12 np0005539505 nova_compute[186958]: 2025-11-29 07:02:12.404 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:02:13 np0005539505 nova_compute[186958]: 2025-11-29 07:02:13.705 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.169 186962 DEBUG nova.compute.manager [req-03a275a0-6d6f-4da1-af4c-df2c99aebdf4 req-85c162b9-0468-45b2-93a3-a3a56ec564c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.169 186962 DEBUG oslo_concurrency.lockutils [req-03a275a0-6d6f-4da1-af4c-df2c99aebdf4 req-85c162b9-0468-45b2-93a3-a3a56ec564c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.170 186962 DEBUG oslo_concurrency.lockutils [req-03a275a0-6d6f-4da1-af4c-df2c99aebdf4 req-85c162b9-0468-45b2-93a3-a3a56ec564c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.170 186962 DEBUG oslo_concurrency.lockutils [req-03a275a0-6d6f-4da1-af4c-df2c99aebdf4 req-85c162b9-0468-45b2-93a3-a3a56ec564c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.170 186962 DEBUG nova.compute.manager [req-03a275a0-6d6f-4da1-af4c-df2c99aebdf4 req-85c162b9-0468-45b2-93a3-a3a56ec564c0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Processing event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.171 186962 DEBUG nova.compute.manager [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.174 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399735.1740384, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.174 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.176 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.178 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance spawned successfully.#033[00m
Nov 29 02:02:15 np0005539505 nova_compute[186958]: 2025-11-29 07:02:15.179 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.605 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.612 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.613 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.614 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.615 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.616 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.617 186962 DEBUG nova.virt.libvirt.driver [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:02:16 np0005539505 nova_compute[186958]: 2025-11-29 07:02:16.624 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:02:16 np0005539505 podman[223761]: 2025-11-29 07:02:16.765455584 +0000 UTC m=+0.072320894 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:02:16 np0005539505 podman[223762]: 2025-11-29 07:02:16.824901521 +0000 UTC m=+0.133973024 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:02:17 np0005539505 nova_compute[186958]: 2025-11-29 07:02:17.036 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:02:17 np0005539505 nova_compute[186958]: 2025-11-29 07:02:17.866 186962 DEBUG nova.compute.manager [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:18 np0005539505 nova_compute[186958]: 2025-11-29 07:02:18.171 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:18 np0005539505 nova_compute[186958]: 2025-11-29 07:02:18.172 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:18 np0005539505 nova_compute[186958]: 2025-11-29 07:02:18.172 186962 DEBUG nova.objects.instance [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:02:18 np0005539505 nova_compute[186958]: 2025-11-29 07:02:18.252 186962 DEBUG oslo_concurrency.lockutils [None req-06351993-f547-4a45-a1a5-15bd63a08155 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:18 np0005539505 nova_compute[186958]: 2025-11-29 07:02:18.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.410 186962 DEBUG nova.compute.manager [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.412 186962 DEBUG oslo_concurrency.lockutils [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.412 186962 DEBUG oslo_concurrency.lockutils [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.413 186962 DEBUG oslo_concurrency.lockutils [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.414 186962 DEBUG nova.compute.manager [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:19 np0005539505 nova_compute[186958]: 2025-11-29 07:02:19.415 186962 WARNING nova.compute.manager [req-4c5dd36a-96f5-4061-9023-84a770f061f9 req-65472141-cb6d-4cd4-ae05-2278ed3db659 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.037 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.498 186962 DEBUG nova.compute.manager [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.499 186962 DEBUG oslo_concurrency.lockutils [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.499 186962 DEBUG oslo_concurrency.lockutils [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.499 186962 DEBUG oslo_concurrency.lockutils [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.500 186962 DEBUG nova.compute.manager [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:02:21 np0005539505 nova_compute[186958]: 2025-11-29 07:02:21.500 186962 WARNING nova.compute.manager [req-31afefb1-be14-48f3-a96b-3092dc7631d4 req-312ea7e1-ebb7-4c3e-ba5d-4c144f41d265 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:02:22 np0005539505 podman[223811]: 2025-11-29 07:02:22.747107576 +0000 UTC m=+0.077892772 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:02:23 np0005539505 nova_compute[186958]: 2025-11-29 07:02:23.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:23 np0005539505 nova_compute[186958]: 2025-11-29 07:02:23.736 186962 INFO nova.compute.manager [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Rebuilding instance#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.581 186962 DEBUG nova.compute.manager [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.676 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_requests' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.690 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'pci_devices' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.703 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.718 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'migration_context' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.730 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:02:24 np0005539505 nova_compute[186958]: 2025-11-29 07:02:24.733 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:02:24 np0005539505 podman[223831]: 2025-11-29 07:02:24.747837795 +0000 UTC m=+0.074082164 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:02:26 np0005539505 nova_compute[186958]: 2025-11-29 07:02:26.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:26.939 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:26.940 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:26.941 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.711 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.972 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.996 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Triggering sync for uuid b681523d-c882-4406-a91b-5cae6d761201 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.997 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Triggering sync for uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.997 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.998 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "b681523d-c882-4406-a91b-5cae6d761201" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.998 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.998 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.998 186962 INFO nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (rebuilding). Skip.#033[00m
Nov 29 02:02:28 np0005539505 nova_compute[186958]: 2025-11-29 07:02:28.999 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:29 np0005539505 nova_compute[186958]: 2025-11-29 07:02:29.056 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "b681523d-c882-4406-a91b-5cae6d761201" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:31 np0005539505 nova_compute[186958]: 2025-11-29 07:02:31.043 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:33 np0005539505 nova_compute[186958]: 2025-11-29 07:02:33.713 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:34 np0005539505 nova_compute[186958]: 2025-11-29 07:02:34.779 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:02:35 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:35Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:02:35 np0005539505 ovn_controller[95143]: 2025-11-29T07:02:35Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:02:36 np0005539505 nova_compute[186958]: 2025-11-29 07:02:36.045 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:36 np0005539505 podman[223875]: 2025-11-29 07:02:36.717272294 +0000 UTC m=+0.050854344 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:02:36 np0005539505 podman[223876]: 2025-11-29 07:02:36.748279865 +0000 UTC m=+0.075776672 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:02:38 np0005539505 nova_compute[186958]: 2025-11-29 07:02:38.715 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:40 np0005539505 podman[223918]: 2025-11-29 07:02:40.757636785 +0000 UTC m=+0.083500521 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:02:41 np0005539505 nova_compute[186958]: 2025-11-29 07:02:41.048 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:43 np0005539505 nova_compute[186958]: 2025-11-29 07:02:43.404 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:43 np0005539505 nova_compute[186958]: 2025-11-29 07:02:43.718 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:45 np0005539505 nova_compute[186958]: 2025-11-29 07:02:45.835 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance in state 1 after 21 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:02:46 np0005539505 nova_compute[186958]: 2025-11-29 07:02:46.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:47 np0005539505 podman[223937]: 2025-11-29 07:02:47.729263976 +0000 UTC m=+0.066131338 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:02:47 np0005539505 podman[223938]: 2025-11-29 07:02:47.77379772 +0000 UTC m=+0.103947431 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.088 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'name': 'tempest-ServersAdminTestJSON-server-1580825723', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000035', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '80b4126e17a14d73b40158a57f19d091', 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'hostId': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.094 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b681523d-c882-4406-a91b-5cae6d761201', 'name': 'tempest-ServersAdminTestJSON-server-647782358', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '80b4126e17a14d73b40158a57f19d091', 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'hostId': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.099 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 230d36aa-b1ff-4e7d-a024-af0021cd0044 / tape5af6202-8a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.100 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.outgoing.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.105 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b681523d-c882-4406-a91b-5cae6d761201 / tap26c27b84-e5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.105 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2590fcc0-1316-49c3-b3a1-c03c971daed5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.095933', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69b304e2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '8576832cd9937a0c615e516142720d806a81024ffc1a04fdd85a53063eb8ca11'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.095933', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69b3d5ca-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '47bf99c55394f2f0a95a2498c2367cbb70201fc2932ece7497b42a8106ce01b6'}]}, 'timestamp': '2025-11-29 07:02:48.106648', '_unique_id': '6ca555eeda3e47b98424ac242220b1b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.109 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.113 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.113 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3433d56a-6615-475c-ae18-1e8690b8bf70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.113120', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69b4f1d0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': 'cd58efe2e000747dba06bdf259050fa256f3e6bdefb47e83a2dd6a5f41318c2e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.113120', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69b506c0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': 'e8937f5c6cb681a4a1de0eb0a9e92cc7134a500881c52556042bf66c4c374831'}]}, 'timestamp': '2025-11-29 07:02:48.114376', '_unique_id': '8f5e5672ec9a4635ad5603d9d540e5e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.115 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.118 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.119 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3638ce8d-8620-4d63-b216-83455280087a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.118376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69b5bf48-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '4841c130d488d669b2f7c4b829362723ae9648b67cb5d26725f29c8e54d54c24'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.118376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69b5dd70-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '631d75a9c2fdb5f08166c2571935793347a02634813f037e4bc93dc8407c64fc'}]}, 'timestamp': '2025-11-29 07:02:48.119921', '_unique_id': '1f7eda701e0a4f99a816c1873709fec9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.123 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.124 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91bb9cc2-993e-4f61-8b87-0f91ad6cd087', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.123676', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69b6868a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '4bb36b127b5efb048ed81cab0aebc160e3da05733dc929eefaa651b61e15bf60'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.123676', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69b69d00-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '4aac5d08fc0f1a8ca57bd24b1d971843bae97c2a2537c66ed11d5d99d5043243'}]}, 'timestamp': '2025-11-29 07:02:48.124797', '_unique_id': 'a84876b3111f415f91a3caf1831677f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.125 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.172 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.latency volume: 543266816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.173 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.latency volume: 269557843 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.204 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.latency volume: 467359721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.205 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.latency volume: 19210136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc8c4aa8-d9df-4932-a93f-3f86ebbac705', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 543266816, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.127731', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69bdff46-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '24fa8ba9d95ee9a7bb8b134ce0da4870d7aae536db9a5ce40620e11dacfee3e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 269557843, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.127731', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69be183c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '4a1842ce22316935abcd51f798f1ccf940a4c828b89137b789640b6be6672c08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 467359721, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.127731', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c2f05a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'bb9b63fb5a0a26133cf41e470e8c85133cbbe25503e82b811ee022837a121c7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19210136, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.127731', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c3040a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'c396cd777dc61ec44b0fc17e84893b498d97270026c563551dc5233b317d055d'}]}, 'timestamp': '2025-11-29 07:02:48.206052', '_unique_id': 'a8b555f047414c308518d82e2a75aeef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.209 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.requests volume: 334 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.209 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.210 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.210 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00018b3a-276c-4275-b6ac-a00d95fde579', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 334, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.209488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c39da2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '4f69fcd1d677c135e1a9c450f9534e6a55f8c00d9c936653ec22737009a982d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.209488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c3aefa-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': 'cec59f46bbde0deafdbf57a4715b3f4356966f046f3a2e67a98e87598b7e0f2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.209488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c3c1f6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'c70ee72727519c1848a4a2fc91eb6df2608faa31d7aad64038dc19ae3a192def'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.209488', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c3d65a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'e3107a2b8affc91e41c5d256c0fe07b39819674418c7c60fc6215894072ffed6'}]}, 'timestamp': '2025-11-29 07:02:48.211459', '_unique_id': '71b786e60e64440397c500e1c3e4649a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.214 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.214 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c82daf81-19ab-4413-b122-7f5b8695e42d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.214073', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69c45256-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '787a88a3ab6092b7a99914004841c745f765e2432e4d21366d7a41578065fef9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.214073', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69c46584-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': 'd30025ab7db27f115770eb0ca73095a60060e5f513774784e375e397ea354772'}]}, 'timestamp': '2025-11-29 07:02:48.215101', '_unique_id': '805859c99e0c4cfbbb65176b0a559d43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.231 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.232 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.246 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.247 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7b286f5-29bd-44fa-b5b2-a22f45ca35b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.218041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c709c4-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': '03d6cb40b77ce50fd1a2be7dee680cea7dcb832bca353f895bdd0b53ff7a6725'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.218041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c71e1e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': '796b405382208a63e55e0a576e73b42206da0635204274dc549877c2d9a002e7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.218041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69c954ae-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': 'd74f3db6dc833b47c6d74083745cebe5829b7c07051ed1d1698e42a5dd3aa85a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.218041', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69c96962-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': '1c4a79cb61928e987658da6304070d8ca8b03f8dd309611be41d2fe09a0f4223'}]}, 'timestamp': '2025-11-29 07:02:48.247956', '_unique_id': '38ee697e12b745d2a8209ddfa755e83b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.250 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.251 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>]
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.251 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.252 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>]
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.252 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.253 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>]
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.253 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.latency volume: 73496891721 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.254 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.254 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.latency volume: 64980598536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.254 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '171fc7a0-a315-4268-b6bf-549c11727c2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 73496891721, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.253555', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69ca5688-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': 'de520b056537449b339ea85625b20e7154fef824382ff76f856490aa3fcc3dc2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.253555', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69ca6952-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '27b5b9f64b18878f408ade8b3bd9d337ccb0cd68a2f8ba7312063a34439fd4bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 64980598536, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.253555', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69ca7a3c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': '911a34a103f74d8f4eb2c2d3a947cefdb9657690e3157626175e32ce17bb6b92'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.253555', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69ca8a36-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': '2f0cb1e6f709930359deddb55186347797ee4d3a194fa84686f6a1e0fe74a75b'}]}, 'timestamp': '2025-11-29 07:02:48.255372', '_unique_id': '465cc5e636dc47daaf87f3feb3f54296'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.257 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.outgoing.bytes volume: 2250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.258 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75914aab-8329-4294-98c0-b16c9bb6d1bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2250, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.257895', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69cb0042-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '39214e1f6a6dce33204c8795f06c1fa7fdcb8eefc6b82482715ef7c06cb7f179'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.257895', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69cb1474-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '54f738b2e87d806331a3a1c8175234a4736fcc9f883b15641f66dd44f4378876'}]}, 'timestamp': '2025-11-29 07:02:48.258898', '_unique_id': 'c54c1eddf8534cf692a42ff584cac572'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.261 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.284 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/memory.usage volume: 43.10546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.309 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/memory.usage volume: 46.19140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92ac7e7b-7584-45ea-8699-a46b29d9893f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.10546875, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'timestamp': '2025-11-29T07:02:48.261402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69cf13f8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.925195549, 'message_signature': '151f643bff9849061a5b2190afa92d00e1aa8af8a4a9d97eed62f1238de2eb3a'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.19140625, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'b681523d-c882-4406-a91b-5cae6d761201', 'timestamp': '2025-11-29T07:02:48.261402', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '69d2ecda-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.950537278, 'message_signature': '0101daf4605a9c20edaf0db929aa5377d28a2ccc438dde385483044507d638e4'}]}, 'timestamp': '2025-11-29 07:02:48.310368', '_unique_id': 'd2157f3db62749918a7ea0db2e89ed43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.311 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.313 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.requests volume: 1103 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.313 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.314 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.requests volume: 1088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.314 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da6223eb-e4cf-44b2-9082-33d0b3f5ec8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1103, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.313376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d3774a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '97ef41d23e2351f23b0ce84eb8b71400afb3da6cc571ceb2f85174dacee339d9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': 
None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.313376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d38870-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': 'a03e4bc77f8f2cefabfd4515f7b5b1c6f57c6f23c1887bf3957f0520167f4183'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1088, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.313376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d39acc-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'f72b12b0f7a1c2895be36ade4a680923b0ee525e0cbe82d12a66b5facbc3ab05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.313376', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d3aae4-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'e95d731dcfff3b31750f5af11f82d324fb793507cd4871a8045ff218c4db36f6'}]}, 'timestamp': '2025-11-29 07:02:48.315155', '_unique_id': '38d9d8fd3c3e4ab188a0b3bf519312d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.316 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.317 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.318 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57de15b9-f585-49d3-8d61-86e0042c7724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.317825', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69d425aa-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '8a001bf51eedcb658790df4fd068cbf6a339e19cd94e20e34a07b96c771cd7bf'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 
'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.317825', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69d43eaa-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '3b515b75b05d9e622fa71b2a2677ebd86ad089993880a565364f87b79fa619f5'}]}, 'timestamp': '2025-11-29 07:02:48.319004', '_unique_id': '1456fe81fca44303af5b50825cf9c683'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.320 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.322 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.322 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.322 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.323 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d78455a-66dc-4e52-ad1c-16cedd17ed4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.321997', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d4ca0a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': '26223f03ea8083c9ef3d567496ae0089037cadc1d4eb230de722cd2999308042'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.321997', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d4db4e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': '13181af75711e6d2e6f4f1f10b4135891c39453d0f561303665b7f2fa80a6251'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.321997', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d4ec1a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': '5b26ed774b6eefcb09346de8206c8714a84f7624d113d9da3e477bd4ac9c058f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.321997', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d4fe12-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': 'dead1c5c64f9074735df66c293b508f7b6c1badf3040d3d8099fa8ccda506bc9'}]}, 'timestamp': '2025-11-29 07:02:48.323881', '_unique_id': '5cef831e9d4c48de89a7a3738ddc61aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.325 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.326 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.bytes volume: 72962048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.327 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.327 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.bytes volume: 72888320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.328 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '535bd58a-41d7-4c62-9437-2a8a1db7415b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72962048, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.326787', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d58328-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '8c181b84f259f17375993e4c1e1cd9a2cae53640ec46034e576e11d90a555d97'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 
'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.326787', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d597e6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '1c8ac430c163f02637a20a6553cd36fe3f4b2528d80101de859c642b2c22a220'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72888320, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.326787', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d5abf0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': '85573ba96a0520a07d1751924c1c28424ba98b1dc563b8e3c89a1f11ded191bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.326787', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d5bf5a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'fdc60bc3441bf906e784b34d1581e020cb318236e2db7b7618c9959d320f9a3d'}]}, 'timestamp': '2025-11-29 07:02:48.328800', '_unique_id': '8e2212a8399c4b14b980f080241ced97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.329 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.331 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.331 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/cpu volume: 11340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.332 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/cpu volume: 12520000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2bef7b6-169b-46b7-97dc-5bcaca9912e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11340000000, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'timestamp': '2025-11-29T07:02:48.331535', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69d63ea8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.925195549, 'message_signature': 'c3a8ef12b1743dc2366c5303ee5d7055cda9512e940f8ad4178f0ca6007f8654'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12520000000, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'b681523d-c882-4406-a91b-5cae6d761201', 'timestamp': '2025-11-29T07:02:48.331535', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '69d652c6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.950537278, 'message_signature': 'd0d75f5861dcfa3e1d3f643d373e96d7643e14a1171794d92da452163e05cf54'}]}, 'timestamp': '2025-11-29 07:02:48.332621', '_unique_id': '2be8aa4b464344e3b002ff0accc137ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.333 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.335 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.335 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd5d43fe-b5fe-47d6-ae09-1f2e49609b65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.335171', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69d6ccd8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': 'b6010a984fd5b24a1a1ce511d68fce8d396f64d9a63e15def8573fffba145c92'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.335171', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69d6e22c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': 'ffa6bd68ed6ea7366e2e511bd36c2499cfda652c729e41eee0bc9b8141038d5c'}]}, 'timestamp': '2025-11-29 07:02:48.336310', '_unique_id': '1af7d30bd7254f76b05e1c6b2071a71f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.339 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.339 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.339 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAdminTestJSON-server-1580825723>, <NovaLikeServer: tempest-ServersAdminTestJSON-server-647782358>]
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.339 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.340 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.340 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdca350b-9487-441a-8f6d-bee13494f9e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.340044', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69d78cc2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '4e8c200936579b591fda299d03f08eaaadda69bb2255dc29e676eab0ba7eca24'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.340044', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69d7a05e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '5798d0be7f94627b284dc9b8d3a9ac41de87ecee72eed05115c6605ea2ca5e1a'}]}, 'timestamp': '2025-11-29 07:02:48.341360', '_unique_id': 'ea34e1dd2ff743cfaefc5187e31ca9c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.344 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.bytes volume: 30775808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.344 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.345 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.bytes volume: 30296576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.345 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc1f5d3d-93e2-4ace-a319-26a1f3cebc04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30775808, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.344046', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d82812-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '883d912d9f5f7d76fad3adb0f3d3b0ec38d0c7466deaab92b99b8c41bbbda350'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 
'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.344046', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d83abe-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.768636115, 'message_signature': '4917f5f69946435a69ceb9b7fdb0cfd552bce0727e479cca7317ee5320774994'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30296576, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.344046', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d84d92-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': 'fdbbc7f4c9fd031e18020c0db4dabeef17b1179e56f3a354cab89d81f79b7841'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.344046', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d85efe-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.814648981, 'message_signature': '9a344f64d80f4dd8d7b5c3d6b95f62e65cf4a13907660ccd4990504c48ab1a13'}]}, 'timestamp': '2025-11-29 07:02:48.345994', '_unique_id': '7039ea0a1a62450488f3276d8740d502'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.348 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/network.incoming.bytes volume: 1388 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.349 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/network.incoming.bytes volume: 1808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08017ddc-7bfa-42ea-bdaa-0c999497f06f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1388, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000035-230d36aa-b1ff-4e7d-a024-af0021cd0044-tape5af6202-8a', 'timestamp': '2025-11-29T07:02:48.348698', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'tape5af6202-8a', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:47:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape5af6202-8a'}, 'message_id': '69d8db90-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.736843712, 'message_signature': '4434fbbdcd2f465a37d3beccc3f0b2fc8388d43488bae25892f5ea25e25d0e0e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1808, 'user_id': 
'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'instance-00000038-b681523d-c882-4406-a91b-5cae6d761201-tap26c27b84-e5', 'timestamp': '2025-11-29T07:02:48.348698', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'tap26c27b84-e5', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ba:70:30', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap26c27b84-e5'}, 'message_id': '69d8f4a4-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.742478452, 'message_signature': '19af748b078b29128ac1fca2ba72f1d64117ea23c9e8a325568510c333387830'}]}, 'timestamp': '2025-11-29 07:02:48.349865', '_unique_id': '0e854acb3a4140f89be51bb4c166a8a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.352 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.352 12 DEBUG ceilometer.compute.pollsters [-] 230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.352 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.353 12 DEBUG ceilometer.compute.pollsters [-] b681523d-c882-4406-a91b-5cae6d761201/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce3bd397-4a53-4137-b088-1b7482bfb78c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044-vda', 'timestamp': '2025-11-29T07:02:48.352322', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d964ac-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': 'ba366fb8e66e6c6cf914cab79ada44ea151d6178791aeaac52ba1c2ac20737b4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 
'230d36aa-b1ff-4e7d-a024-af0021cd0044-sda', 'timestamp': '2025-11-29T07:02:48.352322', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-1580825723', 'name': 'instance-00000035', 'instance_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a'}, 'image_ref': '3372b7b2-657b-4c4d-9d9d-7c5b771a630a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d96f24-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.858919607, 'message_signature': '170e739774102b0589fc195f2f1ea65560834ba0c7fdd360b9c79db9aa28f905'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-vda', 'timestamp': '2025-11-29T07:02:48.352322', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69d978e8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': '1d12dc3908d6aee8a5164b932abc6d191a8300c99cd6d7973fad4766403dcb72'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cd616d4c2eb44fe0a0da2df1690c0e21', 'user_name': None, 'project_id': '80b4126e17a14d73b40158a57f19d091', 'project_name': None, 'resource_id': 'b681523d-c882-4406-a91b-5cae6d761201-sda', 'timestamp': '2025-11-29T07:02:48.352322', 'resource_metadata': {'display_name': 'tempest-ServersAdminTestJSON-server-647782358', 'name': 'instance-00000038', 'instance_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'instance_type': 'm1.nano', 'host': '7becbdcbce092d4c18ce66cd96ac149b4e9f76cb44456fb360e8010a', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69d98450-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5215.873740468, 'message_signature': 'f9d2e96064f9634216847753b0aba4dc86b45c43c77a1956b2ea8bd1f018c86e'}]}, 'timestamp': '2025-11-29 07:02:48.353414', '_unique_id': '82486dad67bf4f9a8ca7f57442c8be8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:02:48.354 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:02:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:48.532 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:02:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:48.533 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:02:48 np0005539505 nova_compute[186958]: 2025-11-29 07:02:48.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:48 np0005539505 nova_compute[186958]: 2025-11-29 07:02:48.720 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:49 np0005539505 nova_compute[186958]: 2025-11-29 07:02:49.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:51 np0005539505 nova_compute[186958]: 2025-11-29 07:02:51.054 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:52 np0005539505 nova_compute[186958]: 2025-11-29 07:02:52.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.437 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.437 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.438 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.438 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.543 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:53 np0005539505 podman[224001]: 2025-11-29 07:02:53.595577415 +0000 UTC m=+0.091533389 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 02:02:53 np0005539505 nova_compute[186958]: 2025-11-29 07:02:53.723 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.293 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.750s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.295 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.397 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.405 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.472 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.473 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.525 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:02:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:02:54.536 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.704 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.706 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5417MB free_disk=73.16889572143555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.706 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.707 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.909 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 230d36aa-b1ff-4e7d-a024-af0021cd0044 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.909 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance b681523d-c882-4406-a91b-5cae6d761201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.909 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:02:54 np0005539505 nova_compute[186958]: 2025-11-29 07:02:54.910 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:02:55 np0005539505 nova_compute[186958]: 2025-11-29 07:02:55.048 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:02:55 np0005539505 podman[224033]: 2025-11-29 07:02:55.772645428 +0000 UTC m=+0.089337646 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 29 02:02:55 np0005539505 nova_compute[186958]: 2025-11-29 07:02:55.777 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:02:55 np0005539505 nova_compute[186958]: 2025-11-29 07:02:55.983 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:02:55 np0005539505 nova_compute[186958]: 2025-11-29 07:02:55.983 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:02:56 np0005539505 nova_compute[186958]: 2025-11-29 07:02:56.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:56 np0005539505 nova_compute[186958]: 2025-11-29 07:02:56.889 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance in state 1 after 32 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:02:56 np0005539505 nova_compute[186958]: 2025-11-29 07:02:56.979 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:58 np0005539505 nova_compute[186958]: 2025-11-29 07:02:58.725 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:02:59 np0005539505 nova_compute[186958]: 2025-11-29 07:02:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:59 np0005539505 nova_compute[186958]: 2025-11-29 07:02:59.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:02:59 np0005539505 nova_compute[186958]: 2025-11-29 07:02:59.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:03:01 np0005539505 nova_compute[186958]: 2025-11-29 07:03:01.059 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:01 np0005539505 nova_compute[186958]: 2025-11-29 07:03:01.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:01 np0005539505 nova_compute[186958]: 2025-11-29 07:03:01.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:03:02 np0005539505 nova_compute[186958]: 2025-11-29 07:03:02.319 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:03:02 np0005539505 nova_compute[186958]: 2025-11-29 07:03:02.319 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:03:02 np0005539505 nova_compute[186958]: 2025-11-29 07:03:02.319 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:03:03 np0005539505 nova_compute[186958]: 2025-11-29 07:03:03.727 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:04 np0005539505 nova_compute[186958]: 2025-11-29 07:03:04.287 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updating instance_info_cache with network_info: [{"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:04 np0005539505 nova_compute[186958]: 2025-11-29 07:03:04.313 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-b681523d-c882-4406-a91b-5cae6d761201" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:03:04 np0005539505 nova_compute[186958]: 2025-11-29 07:03:04.313 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:03:06 np0005539505 nova_compute[186958]: 2025-11-29 07:03:06.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:07 np0005539505 podman[224055]: 2025-11-29 07:03:07.765126732 +0000 UTC m=+0.073243590 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:03:07 np0005539505 podman[224054]: 2025-11-29 07:03:07.772576074 +0000 UTC m=+0.086106316 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible)
Nov 29 02:03:07 np0005539505 nova_compute[186958]: 2025-11-29 07:03:07.946 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance in state 1 after 43 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:03:08 np0005539505 nova_compute[186958]: 2025-11-29 07:03:08.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:08 np0005539505 nova_compute[186958]: 2025-11-29 07:03:08.755 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:11 np0005539505 nova_compute[186958]: 2025-11-29 07:03:11.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:11 np0005539505 podman[224100]: 2025-11-29 07:03:11.715309355 +0000 UTC m=+0.048729804 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:03:13 np0005539505 nova_compute[186958]: 2025-11-29 07:03:13.757 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:14 np0005539505 kernel: tape5af6202-8a (unregistering): left promiscuous mode
Nov 29 02:03:14 np0005539505 NetworkManager[55134]: <info>  [1764399794.9529] device (tape5af6202-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:14Z|00240|binding|INFO|Releasing lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 from this chassis (sb_readonly=0)
Nov 29 02:03:14 np0005539505 nova_compute[186958]: 2025-11-29 07:03:14.963 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:14Z|00241|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 down in Southbound
Nov 29 02:03:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:14Z|00242|binding|INFO|Removing iface tape5af6202-8a ovn-installed in OVS
Nov 29 02:03:14 np0005539505 nova_compute[186958]: 2025-11-29 07:03:14.967 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:14 np0005539505 nova_compute[186958]: 2025-11-29 07:03:14.979 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:14 np0005539505 nova_compute[186958]: 2025-11-29 07:03:14.981 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance shutdown successfully after 50 seconds.#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.017 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '6', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.019 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.021 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:15 np0005539505 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000035.scope: Deactivated successfully.
Nov 29 02:03:15 np0005539505 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000035.scope: Consumed 14.958s CPU time.
Nov 29 02:03:15 np0005539505 systemd-machined[153285]: Machine qemu-29-instance-00000035 terminated.
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.038 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e09fb98d-c25b-4d4a-ba07-f797dbf04d31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.070 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[22034c94-5c7d-483f-b285-3da9c39a8b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.074 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0ee6c9-7f68-4a7c-a54e-a010126ec2d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.103 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[27c1df6d-aacf-438a-ad64-dd2812825eeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.122 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[192f3c17-6065-464e-ba6d-3eb55e35bcb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 39366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224130, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.139 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a397bf-548e-4d8f-93d8-1f8b469c9503]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224131, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224131, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.142 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.143 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.148 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.149 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.150 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.150 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:15.150 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:15 np0005539505 kernel: tape5af6202-8a: entered promiscuous mode
Nov 29 02:03:15 np0005539505 kernel: tape5af6202-8a (unregistering): left promiscuous mode
Nov 29 02:03:15 np0005539505 NetworkManager[55134]: <info>  [1764399795.1996] manager: (tape5af6202-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/123)
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.203 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.245 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance destroyed successfully.#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.249 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance destroyed successfully.#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.249 186962 DEBUG nova.virt.libvirt.vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:02:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-
member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:02:22Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.250 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.250 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.251 186962 DEBUG os_vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.253 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5af6202-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.256 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.259 186962 INFO os_vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.259 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deleting instance files /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.260 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deletion of /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del complete#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.496 186962 DEBUG nova.compute.manager [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.496 186962 DEBUG oslo_concurrency.lockutils [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.496 186962 DEBUG oslo_concurrency.lockutils [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.497 186962 DEBUG oslo_concurrency.lockutils [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.497 186962 DEBUG nova.compute.manager [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:15 np0005539505 nova_compute[186958]: 2025-11-29 07:03:15.497 186962 WARNING nova.compute.manager [req-078873fb-5ef7-418d-944d-076ee97f7b48 req-b7d188c0-234f-41b5-a67e-db9b0702d0ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state rebuilding.#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.463 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.463 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating image(s)#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.464 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.464 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.465 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.478 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.531 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.532 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.533 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.544 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.605 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.607 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.987 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk 1073741824" returned: 0 in 0.380s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.988 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:16 np0005539505 nova_compute[186958]: 2025-11-29 07:03:16.989 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.049 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.050 186962 DEBUG nova.virt.disk.api [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Checking if we can resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.051 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.127 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.128 186962 DEBUG nova.virt.disk.api [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Cannot resize image /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.129 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.129 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Ensure instance console log exists: /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.130 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.130 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.130 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.133 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start _get_guest_xml network_info=[{"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.139 186962 WARNING nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.147 186962 DEBUG nova.virt.libvirt.host [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.148 186962 DEBUG nova.virt.libvirt.host [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.152 186962 DEBUG nova.virt.libvirt.host [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.152 186962 DEBUG nova.virt.libvirt.host [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.154 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.154 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.154 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.155 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.155 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.155 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.155 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.156 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.156 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.156 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.156 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.157 186962 DEBUG nova.virt.hardware [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.157 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.178 186962 DEBUG nova.virt.libvirt.vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:02:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:15Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.178 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.179 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.181 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <uuid>230d36aa-b1ff-4e7d-a024-af0021cd0044</uuid>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <name>instance-00000035</name>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersAdminTestJSON-server-1580825723</nova:name>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:03:17</nova:creationTime>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:user uuid="cd616d4c2eb44fe0a0da2df1690c0e21">tempest-ServersAdminTestJSON-1087744064-project-member</nova:user>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:project uuid="80b4126e17a14d73b40158a57f19d091">tempest-ServersAdminTestJSON-1087744064</nova:project>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        <nova:port uuid="e5af6202-8a71-48e2-ae69-2b3cb0d3a948">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="serial">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="uuid">230d36aa-b1ff-4e7d-a024-af0021cd0044</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:ff:47:55"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <target dev="tape5af6202-8a"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/console.log" append="off"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:03:17 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:03:17 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:03:17 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:03:17 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.182 186962 DEBUG nova.virt.libvirt.vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:02:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:03:15Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.183 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.183 186962 DEBUG nova.network.os_vif_util [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.183 186962 DEBUG os_vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.184 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.185 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.185 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.189 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5af6202-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.190 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5af6202-8a, col_values=(('external_ids', {'iface-id': 'e5af6202-8a71-48e2-ae69-2b3cb0d3a948', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:47:55', 'vm-uuid': '230d36aa-b1ff-4e7d-a024-af0021cd0044'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:17 np0005539505 NetworkManager[55134]: <info>  [1764399797.1930] manager: (tape5af6202-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.194 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.198 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.199 186962 INFO os_vif [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.299 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.300 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.300 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] No VIF found with MAC fa:16:3e:ff:47:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.301 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Using config drive#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.318 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.350 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'keypairs' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.641 186962 DEBUG nova.compute.manager [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.642 186962 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.642 186962 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.642 186962 DEBUG oslo_concurrency.lockutils [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.643 186962 DEBUG nova.compute.manager [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:17 np0005539505 nova_compute[186958]: 2025-11-29 07:03:17.643 186962 WARNING nova.compute.manager [req-709aa596-fec7-43a7-8e75-69f0e31cfe5c req-c785c232-70e8-4a8e-85b0-c38a3242535a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.312 186962 INFO nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Creating config drive at /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config#033[00m
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.318 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ozc8yj1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.447 186962 DEBUG oslo_concurrency.processutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ozc8yj1" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:03:18 np0005539505 kernel: tape5af6202-8a: entered promiscuous mode
Nov 29 02:03:18 np0005539505 NetworkManager[55134]: <info>  [1764399798.5326] manager: (tape5af6202-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Nov 29 02:03:18 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:18Z|00243|binding|INFO|Claiming lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for this chassis.
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:18Z|00244|binding|INFO|e5af6202-8a71-48e2-ae69-2b3cb0d3a948: Claiming fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:03:18 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:18Z|00245|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 ovn-installed in OVS
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.572 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:18 np0005539505 systemd-udevd[224202]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:03:18 np0005539505 systemd-machined[153285]: New machine qemu-30-instance-00000035.
Nov 29 02:03:18 np0005539505 NetworkManager[55134]: <info>  [1764399798.5914] device (tape5af6202-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:03:18 np0005539505 NetworkManager[55134]: <info>  [1764399798.5922] device (tape5af6202-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:03:18 np0005539505 systemd[1]: Started Virtual Machine qemu-30-instance-00000035.
Nov 29 02:03:18 np0005539505 podman[224176]: 2025-11-29 07:03:18.626621515 +0000 UTC m=+0.100033390 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:03:18 np0005539505 podman[224177]: 2025-11-29 07:03:18.687477262 +0000 UTC m=+0.158158170 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:03:18 np0005539505 nova_compute[186958]: 2025-11-29 07:03:18.760 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.076 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 230d36aa-b1ff-4e7d-a024-af0021cd0044 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.076 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399799.0754833, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.077 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.080 186962 DEBUG nova.compute.manager [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.081 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.085 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance spawned successfully.#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.085 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:03:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:19Z|00246|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 up in Southbound
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.107 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '7', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.110 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a bound to our chassis#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.113 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.132 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0ceeb91d-92ce-4984-bce5-05e388b566a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.144 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.148 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.162 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.163 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.164 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.165 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.165 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.166 186962 DEBUG nova.virt.libvirt.driver [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.168 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1118b088-adcd-40bb-b16b-5d237f0a3edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.172 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[384633a9-9e73-4971-89ec-2556dc8c7bb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.205 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[71378688-918f-451d-aef9-12c4605843b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.223 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e12e787e-d020-4e79-b1fe-cdc54d88ad9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 39366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224255, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.230 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.230 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399799.0796802, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.230 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Started (Lifecycle Event)#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.242 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cbeaa207-5d8a-40bd-b826-9229967813c9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224256, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224256, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.244 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.291 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.292 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.293 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.293 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:19.294 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.502 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.505 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.562 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.588 186962 DEBUG nova.compute.manager [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.732 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.733 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.733 186962 DEBUG nova.objects.instance [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:03:19 np0005539505 nova_compute[186958]: 2025-11-29 07:03:19.810 186962 DEBUG oslo_concurrency.lockutils [None req-6e497f3d-da6b-4865-a33b-6db821297e5e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.789 186962 DEBUG nova.compute.manager [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.790 186962 DEBUG oslo_concurrency.lockutils [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.790 186962 DEBUG oslo_concurrency.lockutils [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.791 186962 DEBUG oslo_concurrency.lockutils [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.791 186962 DEBUG nova.compute.manager [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:21 np0005539505 nova_compute[186958]: 2025-11-29 07:03:21.791 186962 WARNING nova.compute.manager [req-82cb033f-0b0e-4231-b3b9-e3038c4cbb36 req-27119d9b-b1ad-4e85-bbd5-045c585b1ead 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:03:22 np0005539505 nova_compute[186958]: 2025-11-29 07:03:22.194 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:23 np0005539505 podman[224257]: 2025-11-29 07:03:23.73110664 +0000 UTC m=+0.063241186 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:03:23 np0005539505 nova_compute[186958]: 2025-11-29 07:03:23.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.371 186962 DEBUG nova.compute.manager [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.372 186962 DEBUG oslo_concurrency.lockutils [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.372 186962 DEBUG oslo_concurrency.lockutils [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.372 186962 DEBUG oslo_concurrency.lockutils [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.373 186962 DEBUG nova.compute.manager [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:24 np0005539505 nova_compute[186958]: 2025-11-29 07:03:24.373 186962 WARNING nova.compute.manager [req-80f977a1-368e-4d44-ac81-45727813b68b req-19b03b26-7f40-4cf6-b315-9531f69d4af0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:03:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:25.722 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:25 np0005539505 nova_compute[186958]: 2025-11-29 07:03:25.722 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:25.726 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:03:26 np0005539505 podman[224278]: 2025-11-29 07:03:26.743781523 +0000 UTC m=+0.070458171 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:03:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:26.940 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:26.941 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:26.941 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:27 np0005539505 nova_compute[186958]: 2025-11-29 07:03:27.196 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:28 np0005539505 nova_compute[186958]: 2025-11-29 07:03:28.764 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.909 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.910 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.910 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "b681523d-c882-4406-a91b-5cae6d761201-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.911 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.911 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:31 np0005539505 nova_compute[186958]: 2025-11-29 07:03:31.960 186962 INFO nova.compute.manager [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Terminating instance#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.040 186962 DEBUG nova.compute.manager [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.200 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 kernel: tap26c27b84-e5 (unregistering): left promiscuous mode
Nov 29 02:03:32 np0005539505 NetworkManager[55134]: <info>  [1764399812.2148] device (tap26c27b84-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.225 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:32Z|00247|binding|INFO|Releasing lport 26c27b84-e5c3-4a6f-8631-6517e783ca9b from this chassis (sb_readonly=0)
Nov 29 02:03:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:32Z|00248|binding|INFO|Setting lport 26c27b84-e5c3-4a6f-8631-6517e783ca9b down in Southbound
Nov 29 02:03:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:32Z|00249|binding|INFO|Removing iface tap26c27b84-e5 ovn-installed in OVS
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.227 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.250 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:70:30 10.100.0.11'], port_security=['fa:16:3e:ba:70:30 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b681523d-c882-4406-a91b-5cae6d761201', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '4', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=26c27b84-e5c3-4a6f-8631-6517e783ca9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.252 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 26c27b84-e5c3-4a6f-8631-6517e783ca9b in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.254 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b97f3d85-11c0-4475-aea6-e8da158df42a#033[00m
Nov 29 02:03:32 np0005539505 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Deactivated successfully.
Nov 29 02:03:32 np0005539505 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000038.scope: Consumed 18.596s CPU time.
Nov 29 02:03:32 np0005539505 systemd-machined[153285]: Machine qemu-28-instance-00000038 terminated.
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.272 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[30c4d791-20ac-4ccc-bb68-1b71af767100]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.310 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c4426d-88bc-4f0c-8549-23fc3e752642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.314 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b90cab5b-a1e2-4fa9-99a1-8eb0862ec763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.344 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b5d0a0-243c-4dd8-b936-9361a6fe53a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.367 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3b281174-2248-4155-a0c8-216633253e54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb97f3d85-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:53:e2:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510568, 'reachable_time': 39366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224321, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.384 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[59580d01-9442-4be4-a263-b2ab44aa48ca]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510580, 'tstamp': 510580}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224322, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb97f3d85-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510582, 'tstamp': 510582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224322, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.386 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.388 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.392 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.393 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97f3d85-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.394 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.394 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb97f3d85-10, col_values=(('external_ids', {'iface-id': 'e6d6aadc-4cde-4c62-a881-70607e3666f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.394 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.514 186962 INFO nova.virt.libvirt.driver [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] Instance destroyed successfully.#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.515 186962 DEBUG nova.objects.instance [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid b681523d-c882-4406-a91b-5cae6d761201 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.539 186962 DEBUG nova.virt.libvirt.vif [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:01:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-647782358',display_name='tempest-ServersAdminTestJSON-server-647782358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-647782358',id=56,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:01:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-9ic89zym',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:01:24Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=b681523d-c882-4406-a91b-5cae6d761201,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.539 186962 DEBUG nova.network.os_vif_util [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "address": "fa:16:3e:ba:70:30", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26c27b84-e5", "ovs_interfaceid": "26c27b84-e5c3-4a6f-8631-6517e783ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.540 186962 DEBUG nova.network.os_vif_util [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.541 186962 DEBUG os_vif [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.544 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26c27b84-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.545 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.549 186962 INFO os_vif [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ba:70:30,bridge_name='br-int',has_traffic_filtering=True,id=26c27b84-e5c3-4a6f-8631-6517e783ca9b,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26c27b84-e5')#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.550 186962 INFO nova.virt.libvirt.driver [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Deleting instance files /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201_del#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.551 186962 INFO nova.virt.libvirt.driver [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Deletion of /var/lib/nova/instances/b681523d-c882-4406-a91b-5cae6d761201_del complete#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.632 186962 INFO nova.compute.manager [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Took 0.59 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.633 186962 DEBUG oslo.service.loopingcall [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.633 186962 DEBUG nova.compute.manager [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:03:32 np0005539505 nova_compute[186958]: 2025-11-29 07:03:32.634 186962 DEBUG nova.network.neutron [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:03:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:32.729 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.259 186962 DEBUG nova.network.neutron [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.277 186962 INFO nova.compute.manager [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] Took 0.64 seconds to deallocate network for instance.#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.357 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.358 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.456 186962 DEBUG nova.compute.manager [req-a92ebca0-b68a-43b8-8d8a-39150b7b095a req-3ed63cb1-2a4c-40b1-a498-4baf44cc6a3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b681523d-c882-4406-a91b-5cae6d761201] Received event network-vif-deleted-26c27b84-e5c3-4a6f-8631-6517e783ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.458 186962 DEBUG nova.compute.provider_tree [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.474 186962 DEBUG nova.scheduler.client.report [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.501 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.531 186962 INFO nova.scheduler.client.report [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Deleted allocations for instance b681523d-c882-4406-a91b-5cae6d761201#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.616 186962 DEBUG oslo_concurrency.lockutils [None req-5c43d94a-8246-482a-a7dc-d61316dd330e cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "b681523d-c882-4406-a91b-5cae6d761201" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:33 np0005539505 nova_compute[186958]: 2025-11-29 07:03:33.766 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:34Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:03:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:34Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:47:55 10.100.0.14
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.546 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.815 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.815 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.815 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.816 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.816 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.840 186962 INFO nova.compute.manager [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Terminating instance#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.851 186962 DEBUG nova.compute.manager [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:03:37 np0005539505 kernel: tape5af6202-8a (unregistering): left promiscuous mode
Nov 29 02:03:37 np0005539505 NetworkManager[55134]: <info>  [1764399817.8816] device (tape5af6202-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:03:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:37Z|00250|binding|INFO|Releasing lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 from this chassis (sb_readonly=0)
Nov 29 02:03:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:37Z|00251|binding|INFO|Setting lport e5af6202-8a71-48e2-ae69-2b3cb0d3a948 down in Southbound
Nov 29 02:03:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:03:37Z|00252|binding|INFO|Removing iface tape5af6202-8a ovn-installed in OVS
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:37.898 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:47:55 10.100.0.14'], port_security=['fa:16:3e:ff:47:55 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '230d36aa-b1ff-4e7d-a024-af0021cd0044', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b97f3d85-11c0-4475-aea6-e8da158df42a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80b4126e17a14d73b40158a57f19d091', 'neutron:revision_number': '8', 'neutron:security_group_ids': '95da808a-355c-4cca-8e02-4813ef09195a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fd7ef61-4cb3-485a-8b86-3b1a506a9944, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e5af6202-8a71-48e2-ae69-2b3cb0d3a948) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:03:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:37.900 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e5af6202-8a71-48e2-ae69-2b3cb0d3a948 in datapath b97f3d85-11c0-4475-aea6-e8da158df42a unbound from our chassis#033[00m
Nov 29 02:03:37 np0005539505 nova_compute[186958]: 2025-11-29 07:03:37.902 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:37.902 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b97f3d85-11c0-4475-aea6-e8da158df42a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:03:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:37.903 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b36d671d-a729-4dba-b374-365303d0c72f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:37.904 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a namespace which is not needed anymore#033[00m
Nov 29 02:03:37 np0005539505 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000035.scope: Deactivated successfully.
Nov 29 02:03:37 np0005539505 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000035.scope: Consumed 13.178s CPU time.
Nov 29 02:03:37 np0005539505 systemd-machined[153285]: Machine qemu-30-instance-00000035 terminated.
Nov 29 02:03:37 np0005539505 podman[224356]: 2025-11-29 07:03:37.970988163 +0000 UTC m=+0.058110020 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:03:37 np0005539505 podman[224354]: 2025-11-29 07:03:37.979142455 +0000 UTC m=+0.068385512 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.076 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.083 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.119 186962 INFO nova.virt.libvirt.driver [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Instance destroyed successfully.#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.120 186962 DEBUG nova.objects.instance [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lazy-loading 'resources' on Instance uuid 230d36aa-b1ff-4e7d-a024-af0021cd0044 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.134 186962 DEBUG nova.virt.libvirt.vif [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:00:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1580825723',display_name='tempest-ServersAdminTestJSON-server-1580825723',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1580825723',id=53,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:03:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80b4126e17a14d73b40158a57f19d091',ramdisk_id='',reservation_id='r-uhc27c80',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1087744064',owner_user_name='tempest-ServersAdminTestJSON-1087744064-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:03:26Z,user_data=None,user_id='cd616d4c2eb44fe0a0da2df1690c0e21',uuid=230d36aa-b1ff-4e7d-a024-af0021cd0044,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.134 186962 DEBUG nova.network.os_vif_util [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converting VIF {"id": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "address": "fa:16:3e:ff:47:55", "network": {"id": "b97f3d85-11c0-4475-aea6-e8da158df42a", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1300681246-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "80b4126e17a14d73b40158a57f19d091", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5af6202-8a", "ovs_interfaceid": "e5af6202-8a71-48e2-ae69-2b3cb0d3a948", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.135 186962 DEBUG nova.network.os_vif_util [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.135 186962 DEBUG os_vif [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.137 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.137 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5af6202-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.139 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.142 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.145 186962 INFO os_vif [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:47:55,bridge_name='br-int',has_traffic_filtering=True,id=e5af6202-8a71-48e2-ae69-2b3cb0d3a948,network=Network(b97f3d85-11c0-4475-aea6-e8da158df42a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5af6202-8a')#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.145 186962 INFO nova.virt.libvirt.driver [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deleting instance files /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.146 186962 INFO nova.virt.libvirt.driver [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deletion of /var/lib/nova/instances/230d36aa-b1ff-4e7d-a024-af0021cd0044_del complete#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.156 186962 DEBUG nova.compute.manager [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.157 186962 DEBUG oslo_concurrency.lockutils [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.157 186962 DEBUG oslo_concurrency.lockutils [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.158 186962 DEBUG oslo_concurrency.lockutils [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.158 186962 DEBUG nova.compute.manager [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.158 186962 DEBUG nova.compute.manager [req-ea28bc03-b4f0-4a22-863b-03ed94dd1d76 req-77b2b3ab-4e00-4983-a851-a8aafa6beda0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-unplugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.249 186962 INFO nova.compute.manager [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.250 186962 DEBUG oslo.service.loopingcall [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.251 186962 DEBUG nova.compute.manager [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.251 186962 DEBUG nova.network.neutron [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:03:38 np0005539505 nova_compute[186958]: 2025-11-29 07:03:38.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [NOTICE]   (222958) : haproxy version is 2.8.14-c23fe91
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [NOTICE]   (222958) : path to executable is /usr/sbin/haproxy
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [WARNING]  (222958) : Exiting Master process...
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [WARNING]  (222958) : Exiting Master process...
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [ALERT]    (222958) : Current worker (222960) exited with code 143 (Terminated)
Nov 29 02:03:38 np0005539505 neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a[222954]: [WARNING]  (222958) : All workers exited. Exiting... (0)
Nov 29 02:03:38 np0005539505 systemd[1]: libpod-8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda.scope: Deactivated successfully.
Nov 29 02:03:38 np0005539505 podman[224417]: 2025-11-29 07:03:38.890484993 +0000 UTC m=+0.893507583 container died 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:03:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda-userdata-shm.mount: Deactivated successfully.
Nov 29 02:03:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay-5450dfa4ad9e8f43aa0a93cc8ce577eb1baecfca75dfda4cb3225eab79e11f85-merged.mount: Deactivated successfully.
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.238 186962 DEBUG nova.compute.manager [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.239 186962 DEBUG oslo_concurrency.lockutils [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.239 186962 DEBUG oslo_concurrency.lockutils [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.239 186962 DEBUG oslo_concurrency.lockutils [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.240 186962 DEBUG nova.compute.manager [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] No waiting events found dispatching network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.240 186962 WARNING nova.compute.manager [req-b863a30c-3134-4a8c-b1da-11045abca039 req-68f1b389-101e-40a8-aefe-668c851ba528 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received unexpected event network-vif-plugged-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.267 186962 DEBUG nova.network.neutron [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.286 186962 INFO nova.compute.manager [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Took 2.03 seconds to deallocate network for instance.#033[00m
Nov 29 02:03:40 np0005539505 podman[224417]: 2025-11-29 07:03:40.354480168 +0000 UTC m=+2.357502758 container cleanup 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:03:40 np0005539505 systemd[1]: libpod-conmon-8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda.scope: Deactivated successfully.
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.386 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.387 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.426 186962 DEBUG nova.compute.manager [req-3c176d77-2a92-4bf4-b64d-581109bcc9ba req-a844d8e6-5d2c-4a93-ba2c-4342e1e5d125 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Received event network-vif-deleted-e5af6202-8a71-48e2-ae69-2b3cb0d3a948 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.465 186962 DEBUG nova.compute.provider_tree [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.483 186962 DEBUG nova.scheduler.client.report [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.514 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.543 186962 INFO nova.scheduler.client.report [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Deleted allocations for instance 230d36aa-b1ff-4e7d-a024-af0021cd0044#033[00m
Nov 29 02:03:40 np0005539505 nova_compute[186958]: 2025-11-29 07:03:40.626 186962 DEBUG oslo_concurrency.lockutils [None req-a45e26e7-544c-4b2b-a4c9-cd0704a776f1 cd616d4c2eb44fe0a0da2df1690c0e21 80b4126e17a14d73b40158a57f19d091 - - default default] Lock "230d36aa-b1ff-4e7d-a024-af0021cd0044" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:41 np0005539505 podman[224466]: 2025-11-29 07:03:41.09553592 +0000 UTC m=+0.717670630 container remove 8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.101 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc91bc0-1e53-47ce-b454-dbffb0055f67]: (4, ('Sat Nov 29 07:03:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a (8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda)\n8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda\nSat Nov 29 07:03:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a (8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda)\n8a9e8b8bf7ac0f76bda72f6d31473b5682132e9045cc8275fc2f28e2faf80dda\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.103 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e6bf21-2553-4724-a7fb-66021e1cc2ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.104 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97f3d85-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:03:41 np0005539505 nova_compute[186958]: 2025-11-29 07:03:41.106 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:41 np0005539505 kernel: tapb97f3d85-10: left promiscuous mode
Nov 29 02:03:41 np0005539505 nova_compute[186958]: 2025-11-29 07:03:41.128 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.132 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76a8e51c-3be3-4e6d-8606-44207e27adc2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.168 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[df6e532f-8d43-4497-a092-b8eb234b3e1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.169 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[055c4ae1-4049-4971-b589-ec2e31812ab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.189 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[08480af0-d322-4e3e-a0e3-ffc1d6c37c8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510561, 'reachable_time': 30469, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224480, 'error': None, 'target': 'ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:41 np0005539505 systemd[1]: run-netns-ovnmeta\x2db97f3d85\x2d11c0\x2d4475\x2daea6\x2de8da158df42a.mount: Deactivated successfully.
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.196 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b97f3d85-11c0-4475-aea6-e8da158df42a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:03:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:03:41.197 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c86a4ecc-f6cf-4a67-906e-1822e5d16704]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:03:42 np0005539505 podman[224485]: 2025-11-29 07:03:42.720843874 +0000 UTC m=+0.055678281 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:03:43 np0005539505 nova_compute[186958]: 2025-11-29 07:03:43.179 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:43 np0005539505 nova_compute[186958]: 2025-11-29 07:03:43.770 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:44 np0005539505 nova_compute[186958]: 2025-11-29 07:03:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:47 np0005539505 nova_compute[186958]: 2025-11-29 07:03:47.384 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:47 np0005539505 nova_compute[186958]: 2025-11-29 07:03:47.513 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399812.5122526, b681523d-c882-4406-a91b-5cae6d761201 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:47 np0005539505 nova_compute[186958]: 2025-11-29 07:03:47.514 186962 INFO nova.compute.manager [-] [instance: b681523d-c882-4406-a91b-5cae6d761201] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:47 np0005539505 nova_compute[186958]: 2025-11-29 07:03:47.537 186962 DEBUG nova.compute.manager [None req-b0a74972-3436-41d3-a395-9b5cd205adf0 - - - - - -] [instance: b681523d-c882-4406-a91b-5cae6d761201] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:48 np0005539505 nova_compute[186958]: 2025-11-29 07:03:48.181 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:48 np0005539505 nova_compute[186958]: 2025-11-29 07:03:48.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:48 np0005539505 podman[224506]: 2025-11-29 07:03:48.709768203 +0000 UTC m=+0.045631707 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:03:48 np0005539505 nova_compute[186958]: 2025-11-29 07:03:48.822 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:48 np0005539505 podman[224530]: 2025-11-29 07:03:48.841059089 +0000 UTC m=+0.107277906 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:03:50 np0005539505 nova_compute[186958]: 2025-11-29 07:03:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:52 np0005539505 nova_compute[186958]: 2025-11-29 07:03:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.118 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399818.1166317, 230d36aa-b1ff-4e7d-a024-af0021cd0044 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.119 186962 INFO nova.compute.manager [-] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.228 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.430 186962 DEBUG nova.compute.manager [None req-64da6961-4f5c-4704-9d71-f39a62d0d61b - - - - - -] [instance: 230d36aa-b1ff-4e7d-a024-af0021cd0044] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.499 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.499 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.500 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.500 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.699 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.701 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5736MB free_disk=73.22625350952148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.701 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.702 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.987 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:03:53 np0005539505 nova_compute[186958]: 2025-11-29 07:03:53.987 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.016 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.034 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.034 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.051 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.084 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.123 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.262 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.472 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:03:54 np0005539505 nova_compute[186958]: 2025-11-29 07:03:54.473 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:03:54 np0005539505 podman[224557]: 2025-11-29 07:03:54.80959083 +0000 UTC m=+0.132802121 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:03:57 np0005539505 nova_compute[186958]: 2025-11-29 07:03:57.468 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:03:57 np0005539505 podman[224578]: 2025-11-29 07:03:57.782992316 +0000 UTC m=+0.116360214 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:03:58 np0005539505 nova_compute[186958]: 2025-11-29 07:03:58.231 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:58 np0005539505 nova_compute[186958]: 2025-11-29 07:03:58.825 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.247 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.247 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.317 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.784 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.784 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.790 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:03:59 np0005539505 nova_compute[186958]: 2025-11-29 07:03:59.790 186962 INFO nova.compute.claims [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.119 186962 DEBUG nova.compute.provider_tree [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.300 186962 DEBUG nova.scheduler.client.report [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.441 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.442 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.696 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.696 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:00 np0005539505 nova_compute[186958]: 2025-11-29 07:04:00.859 186962 INFO nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.223 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.448 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.448 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.449 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.596 186962 DEBUG nova.policy [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.859 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.860 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.860 186962 INFO nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Creating image(s)#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.861 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.861 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.861 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.873 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.964 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.965 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.966 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:01 np0005539505 nova_compute[186958]: 2025-11-29 07:04:01.976 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:02 np0005539505 nova_compute[186958]: 2025-11-29 07:04:02.026 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:02 np0005539505 nova_compute[186958]: 2025-11-29 07:04:02.027 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.247 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.822 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk 1073741824" returned: 0 in 1.795s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.823 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.824 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.879 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.880 186962 DEBUG nova.virt.disk.api [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.881 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.957 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.958 186962 DEBUG nova.virt.disk.api [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:04:03 np0005539505 nova_compute[186958]: 2025-11-29 07:04:03.959 186962 DEBUG nova.objects.instance [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid c95d4144-0323-43ff-ad5b-29887709efba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.365 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.366 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Ensure instance console log exists: /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.367 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.369 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.369 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:04 np0005539505 nova_compute[186958]: 2025-11-29 07:04:04.398 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Successfully created port: 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.102 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Successfully updated port: 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.199 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.200 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.200 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.251 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.291 186962 DEBUG nova.compute.manager [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-changed-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.292 186962 DEBUG nova.compute.manager [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Refreshing instance network info cache due to event network-changed-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.292 186962 DEBUG oslo_concurrency.lockutils [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.563 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:04:08 np0005539505 podman[224614]: 2025-11-29 07:04:08.750549699 +0000 UTC m=+0.083974415 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:04:08 np0005539505 podman[224615]: 2025-11-29 07:04:08.755408157 +0000 UTC m=+0.080493416 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:04:08 np0005539505 nova_compute[186958]: 2025-11-29 07:04:08.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:09.110 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:09 np0005539505 nova_compute[186958]: 2025-11-29 07:04:09.111 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:09.112 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:04:10 np0005539505 nova_compute[186958]: 2025-11-29 07:04:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.279 186962 DEBUG nova.network.neutron [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Updating instance_info_cache with network_info: [{"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.519 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.520 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Instance network_info: |[{"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.521 186962 DEBUG oslo_concurrency.lockutils [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.521 186962 DEBUG nova.network.neutron [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Refreshing network info cache for port 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.525 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Start _get_guest_xml network_info=[{"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.531 186962 WARNING nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.548 186962 DEBUG nova.virt.libvirt.host [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.549 186962 DEBUG nova.virt.libvirt.host [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.552 186962 DEBUG nova.virt.libvirt.host [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.553 186962 DEBUG nova.virt.libvirt.host [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.555 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.555 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.555 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.556 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.556 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.556 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.557 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.557 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.557 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.557 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.558 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.558 186962 DEBUG nova.virt.hardware [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.563 186962 DEBUG nova.virt.libvirt.vif [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1459520177',display_name='tempest-ServersTestJSON-server-1459520177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1459520177',id=65,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-cyfaej9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-me
mber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:01Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=c95d4144-0323-43ff-ad5b-29887709efba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.563 186962 DEBUG nova.network.os_vif_util [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.564 186962 DEBUG nova.network.os_vif_util [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.565 186962 DEBUG nova.objects.instance [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid c95d4144-0323-43ff-ad5b-29887709efba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.662 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <uuid>c95d4144-0323-43ff-ad5b-29887709efba</uuid>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <name>instance-00000041</name>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-1459520177</nova:name>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:04:11</nova:creationTime>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        <nova:port uuid="43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="serial">c95d4144-0323-43ff-ad5b-29887709efba</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="uuid">c95d4144-0323-43ff-ad5b-29887709efba</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.config"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:06:2d:5e"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <target dev="tap43bb4672-a2"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/console.log" append="off"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:04:11 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:04:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:04:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:04:11 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.663 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Preparing to wait for external event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.664 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.664 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.665 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.666 186962 DEBUG nova.virt.libvirt.vif [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:03:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1459520177',display_name='tempest-ServersTestJSON-server-1459520177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1459520177',id=65,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-cyfaej9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:01Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=c95d4144-0323-43ff-ad5b-29887709efba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.666 186962 DEBUG nova.network.os_vif_util [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.667 186962 DEBUG nova.network.os_vif_util [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.668 186962 DEBUG os_vif [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.669 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.669 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.670 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.673 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap43bb4672-a2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.674 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap43bb4672-a2, col_values=(('external_ids', {'iface-id': '43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:2d:5e', 'vm-uuid': 'c95d4144-0323-43ff-ad5b-29887709efba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.675 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:11 np0005539505 NetworkManager[55134]: <info>  [1764399851.6779] manager: (tap43bb4672-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.679 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.685 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:11 np0005539505 nova_compute[186958]: 2025-11-29 07:04:11.686 186962 INFO os_vif [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2')#033[00m
Nov 29 02:04:13 np0005539505 podman[224662]: 2025-11-29 07:04:13.768187318 +0000 UTC m=+0.082700749 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:04:13 np0005539505 nova_compute[186958]: 2025-11-29 07:04:13.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:14.116 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:14 np0005539505 nova_compute[186958]: 2025-11-29 07:04:14.235 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:14 np0005539505 nova_compute[186958]: 2025-11-29 07:04:14.236 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:14 np0005539505 nova_compute[186958]: 2025-11-29 07:04:14.236 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:06:2d:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:04:14 np0005539505 nova_compute[186958]: 2025-11-29 07:04:14.237 186962 INFO nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Using config drive#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.264 186962 INFO nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Creating config drive at /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.config#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.269 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp35rs0g8v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.393 186962 DEBUG oslo_concurrency.processutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp35rs0g8v" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:15 np0005539505 kernel: tap43bb4672-a2: entered promiscuous mode
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.4774] manager: (tap43bb4672-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:15Z|00253|binding|INFO|Claiming lport 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 for this chassis.
Nov 29 02:04:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:15Z|00254|binding|INFO|43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825: Claiming fa:16:3e:06:2d:5e 10.100.0.12
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.493 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:2d:5e 10.100.0.12'], port_security=['fa:16:3e:06:2d:5e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c95d4144-0323-43ff-ad5b-29887709efba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.494 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.497 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.508 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ef344adf-c89b-4725-b1d6-bdabfa868e7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.509 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.511 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.511 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ca85272b-0eb7-456f-847e-7e5bc0327c32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 systemd-udevd[224701]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.512 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e96867fa-8225-4932-a4d9-0a3df73343a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 systemd-machined[153285]: New machine qemu-31-instance-00000041.
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.527 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d3be2990-40c9-4bb7-b803-2d4dd31a4f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.5380] device (tap43bb4672-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.5394] device (tap43bb4672-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:04:15 np0005539505 systemd[1]: Started Virtual Machine qemu-31-instance-00000041.
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.541 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7e98a9-30c2-43f5-bba1-cd3f8dfdd3ac]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:15Z|00255|binding|INFO|Setting lport 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 ovn-installed in OVS
Nov 29 02:04:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:15Z|00256|binding|INFO|Setting lport 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 up in Southbound
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.546 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.566 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b69a9bef-c6b7-434c-bea3-f30835cf18a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.571 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[abe20c89-60ac-4328-a6a8-f2b4a61fdc30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.5731] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/128)
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.599 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3970d7ed-7a53-4b6c-9e97-8042719def5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.602 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[96d6fbfe-191c-4708-aec9-ed329b536d58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.6233] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.627 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a45f5f17-0979-4012-b0a2-cb5b62211b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.644 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[72ebf0db-0ff2-4802-9322-2eb4a898e8e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530320, 'reachable_time': 20347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224734, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.648 186962 DEBUG nova.network.neutron [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Updated VIF entry in instance network info cache for port 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.648 186962 DEBUG nova.network.neutron [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Updating instance_info_cache with network_info: [{"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.663 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[611e20d4-5f64-45f2-9ad9-de187a49ae41]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530320, 'tstamp': 530320}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224735, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.678 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[97cecb6d-24a8-40a3-b36a-f22fec6cae92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530320, 'reachable_time': 20347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224736, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.705 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[19625096-67ee-4719-8d08-c88432148f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.720 186962 DEBUG oslo_concurrency.lockutils [req-92e0b081-a12a-4392-bec5-ff68b24029e7 req-f33c70f7-6d43-4ac1-a8da-9f6900bb54ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-c95d4144-0323-43ff-ad5b-29887709efba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.756 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a8907ec8-870d-4e1f-a2aa-ed0acae27f7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.757 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.758 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.758 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.760 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.762 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 NetworkManager[55134]: <info>  [1764399855.7626] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.766 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.767 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:15Z|00257|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.771 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.780 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[526b5934-2048-4b8b-830f-9fb2ca754795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.781 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:04:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:15.782 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.882 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399855.8815126, c95d4144-0323-43ff-ad5b-29887709efba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.882 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.991 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.996 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399855.8821354, c95d4144-0323-43ff-ad5b-29887709efba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:15 np0005539505 nova_compute[186958]: 2025-11-29 07:04:15.996 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:04:16 np0005539505 podman[224773]: 2025-11-29 07:04:16.121641558 +0000 UTC m=+0.022150530 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.227 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.231 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.260 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.942 186962 DEBUG nova.compute.manager [req-d7911ca9-e42c-4853-8276-110116f24dfd req-3a678534-790a-4eed-9df2-6ec719fcf88e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.943 186962 DEBUG oslo_concurrency.lockutils [req-d7911ca9-e42c-4853-8276-110116f24dfd req-3a678534-790a-4eed-9df2-6ec719fcf88e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.943 186962 DEBUG oslo_concurrency.lockutils [req-d7911ca9-e42c-4853-8276-110116f24dfd req-3a678534-790a-4eed-9df2-6ec719fcf88e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.943 186962 DEBUG oslo_concurrency.lockutils [req-d7911ca9-e42c-4853-8276-110116f24dfd req-3a678534-790a-4eed-9df2-6ec719fcf88e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.944 186962 DEBUG nova.compute.manager [req-d7911ca9-e42c-4853-8276-110116f24dfd req-3a678534-790a-4eed-9df2-6ec719fcf88e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Processing event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.945 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.951 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399856.9504755, c95d4144-0323-43ff-ad5b-29887709efba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.951 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.954 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.960 186962 INFO nova.virt.libvirt.driver [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Instance spawned successfully.#033[00m
Nov 29 02:04:16 np0005539505 nova_compute[186958]: 2025-11-29 07:04:16.961 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:17 np0005539505 podman[224773]: 2025-11-29 07:04:17.649921376 +0000 UTC m=+1.550430328 container create 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.912 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.918 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.924 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.925 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.925 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.925 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.926 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:17 np0005539505 nova_compute[186958]: 2025-11-29 07:04:17.926 186962 DEBUG nova.virt.libvirt.driver [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:18 np0005539505 systemd[1]: Started libpod-conmon-968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e.scope.
Nov 29 02:04:18 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:04:18 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e1f9f9dcbf5b9113c382fd8112d4275e94e21bed2e6545ef1c7af5d22e711a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:04:18 np0005539505 nova_compute[186958]: 2025-11-29 07:04:18.591 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:18 np0005539505 nova_compute[186958]: 2025-11-29 07:04:18.779 186962 INFO nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Took 16.92 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:18 np0005539505 nova_compute[186958]: 2025-11-29 07:04:18.779 186962 DEBUG nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:18 np0005539505 nova_compute[186958]: 2025-11-29 07:04:18.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:19 np0005539505 podman[224773]: 2025-11-29 07:04:19.016134155 +0000 UTC m=+2.916643197 container init 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:04:19 np0005539505 podman[224773]: 2025-11-29 07:04:19.02197164 +0000 UTC m=+2.922480592 container start 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:04:19 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [NOTICE]   (224792) : New worker (224794) forked
Nov 29 02:04:19 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [NOTICE]   (224792) : Loading success.
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.503 186962 DEBUG nova.compute.manager [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.503 186962 DEBUG oslo_concurrency.lockutils [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.504 186962 DEBUG oslo_concurrency.lockutils [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.504 186962 DEBUG oslo_concurrency.lockutils [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.504 186962 DEBUG nova.compute.manager [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] No waiting events found dispatching network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.505 186962 WARNING nova.compute.manager [req-4a683ba3-8914-401f-8983-9bf0ee35f5c7 req-36a8aabc-80a6-4e74-a677-72cda48d47e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received unexpected event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:04:19 np0005539505 podman[224803]: 2025-11-29 07:04:19.743701406 +0000 UTC m=+0.075779182 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:04:19 np0005539505 podman[224804]: 2025-11-29 07:04:19.792311896 +0000 UTC m=+0.124379632 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.848 186962 INFO nova.compute.manager [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Took 20.15 seconds to build instance.#033[00m
Nov 29 02:04:19 np0005539505 nova_compute[186958]: 2025-11-29 07:04:19.950 186962 DEBUG oslo_concurrency.lockutils [None req-e6340223-a271-4b2e-a01b-d6c50601eb69 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:21 np0005539505 nova_compute[186958]: 2025-11-29 07:04:21.680 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.973 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.974 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.974 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.974 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:23 np0005539505 nova_compute[186958]: 2025-11-29 07:04:23.975 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.249 186962 INFO nova.compute.manager [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Terminating instance#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.284 186962 DEBUG nova.compute.manager [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:04:24 np0005539505 kernel: tap43bb4672-a2 (unregistering): left promiscuous mode
Nov 29 02:04:24 np0005539505 NetworkManager[55134]: <info>  [1764399864.3040] device (tap43bb4672-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:04:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:24Z|00258|binding|INFO|Releasing lport 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 from this chassis (sb_readonly=0)
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.313 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:24Z|00259|binding|INFO|Setting lport 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 down in Southbound
Nov 29 02:04:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:24Z|00260|binding|INFO|Removing iface tap43bb4672-a2 ovn-installed in OVS
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.314 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:24.322 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:2d:5e 10.100.0.12'], port_security=['fa:16:3e:06:2d:5e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c95d4144-0323-43ff-ad5b-29887709efba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:24.324 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis#033[00m
Nov 29 02:04:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:24.325 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:04:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:24.326 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b9a9ff3-ac7b-41e1-b559-f64f3680b2ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:24.327 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.328 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000041.scope: Deactivated successfully.
Nov 29 02:04:24 np0005539505 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000041.scope: Consumed 7.801s CPU time.
Nov 29 02:04:24 np0005539505 systemd-machined[153285]: Machine qemu-31-instance-00000041 terminated.
Nov 29 02:04:24 np0005539505 kernel: tap43bb4672-a2: entered promiscuous mode
Nov 29 02:04:24 np0005539505 kernel: tap43bb4672-a2 (unregistering): left promiscuous mode
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.588 186962 INFO nova.virt.libvirt.driver [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Instance destroyed successfully.#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.588 186962 DEBUG nova.objects.instance [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid c95d4144-0323-43ff-ad5b-29887709efba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.627 186962 DEBUG nova.virt.libvirt.vif [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:03:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1459520177',display_name='tempest-ServersTestJSON-server-1459520177',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1459520177',id=65,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:18Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-cyfaej9j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:04:18Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=c95d4144-0323-43ff-ad5b-29887709efba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.628 186962 DEBUG nova.network.os_vif_util [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "address": "fa:16:3e:06:2d:5e", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap43bb4672-a2", "ovs_interfaceid": "43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.628 186962 DEBUG nova.network.os_vif_util [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.629 186962 DEBUG os_vif [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.631 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap43bb4672-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.633 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.636 186962 INFO os_vif [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:2d:5e,bridge_name='br-int',has_traffic_filtering=True,id=43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap43bb4672-a2')#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.636 186962 INFO nova.virt.libvirt.driver [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Deleting instance files /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba_del#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.637 186962 INFO nova.virt.libvirt.driver [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Deletion of /var/lib/nova/instances/c95d4144-0323-43ff-ad5b-29887709efba_del complete#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.712 186962 DEBUG nova.compute.manager [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-unplugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.712 186962 DEBUG oslo_concurrency.lockutils [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.712 186962 DEBUG oslo_concurrency.lockutils [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.713 186962 DEBUG oslo_concurrency.lockutils [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.713 186962 DEBUG nova.compute.manager [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] No waiting events found dispatching network-vif-unplugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.713 186962 DEBUG nova.compute.manager [req-99861c72-aa5b-4eea-aa22-b4d437f45251 req-f3cf8f36-c606-4839-984d-fd837cfdb98b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-unplugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.774 186962 INFO nova.compute.manager [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Took 0.49 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.775 186962 DEBUG oslo.service.loopingcall [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.775 186962 DEBUG nova.compute.manager [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:04:24 np0005539505 nova_compute[186958]: 2025-11-29 07:04:24.775 186962 DEBUG nova.network.neutron [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [NOTICE]   (224792) : haproxy version is 2.8.14-c23fe91
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [NOTICE]   (224792) : path to executable is /usr/sbin/haproxy
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [WARNING]  (224792) : Exiting Master process...
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [WARNING]  (224792) : Exiting Master process...
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [ALERT]    (224792) : Current worker (224794) exited with code 143 (Terminated)
Nov 29 02:04:26 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[224788]: [WARNING]  (224792) : All workers exited. Exiting... (0)
Nov 29 02:04:26 np0005539505 systemd[1]: libpod-968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e.scope: Deactivated successfully.
Nov 29 02:04:26 np0005539505 podman[224878]: 2025-11-29 07:04:26.361795962 +0000 UTC m=+1.951628666 container died 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.502 186962 DEBUG nova.network.neutron [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.530 186962 INFO nova.compute.manager [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Took 1.76 seconds to deallocate network for instance.#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.723 186962 DEBUG nova.compute.manager [req-54aa54e4-7f5a-48e4-9c2c-141896c9e132 req-c5ab91ce-37cd-494d-bdeb-63d0baf80183 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-deleted-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.934 186962 DEBUG nova.compute.manager [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.935 186962 DEBUG oslo_concurrency.lockutils [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "c95d4144-0323-43ff-ad5b-29887709efba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.935 186962 DEBUG oslo_concurrency.lockutils [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.936 186962 DEBUG oslo_concurrency.lockutils [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.936 186962 DEBUG nova.compute.manager [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] No waiting events found dispatching network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.936 186962 WARNING nova.compute.manager [req-e2e5b0a0-e5e7-4059-8512-bcb7648c3596 req-c6d64f14-853b-4178-9f32-85fa9e92dcea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Received unexpected event network-vif-plugged-43bb4672-a2ab-47ed-b5d0-1f6d8f1ef825 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:04:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:26.942 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:26.942 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:26.942 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.946 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:26 np0005539505 nova_compute[186958]: 2025-11-29 07:04:26.947 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.022 186962 DEBUG nova.compute.provider_tree [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e-userdata-shm.mount: Deactivated successfully.
Nov 29 02:04:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay-0e1f9f9dcbf5b9113c382fd8112d4275e94e21bed2e6545ef1c7af5d22e711a5-merged.mount: Deactivated successfully.
Nov 29 02:04:27 np0005539505 podman[224908]: 2025-11-29 07:04:27.164739193 +0000 UTC m=+1.495187970 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 29 02:04:27 np0005539505 podman[224878]: 2025-11-29 07:04:27.196922256 +0000 UTC m=+2.786754960 container cleanup 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:04:27 np0005539505 systemd[1]: libpod-conmon-968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e.scope: Deactivated successfully.
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.273 186962 DEBUG nova.scheduler.client.report [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.302 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.321 186962 INFO nova.scheduler.client.report [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance c95d4144-0323-43ff-ad5b-29887709efba#033[00m
Nov 29 02:04:27 np0005539505 podman[224943]: 2025-11-29 07:04:27.342645363 +0000 UTC m=+0.123097625 container remove 968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.348 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0fc1b8-82d9-49a6-9104-561bb3647030]: (4, ('Sat Nov 29 07:04:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e)\n968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e\nSat Nov 29 07:04:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e)\n968eb2c4eca7f2ff9e81a7e8d9c76d9936097c25825da622ee72489fa4fab50e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.349 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[616b6686-4134-4d45-a082-1e118334601f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.350 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.352 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:27 np0005539505 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.368 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[efe7e4bb-3c1b-4c55-a857-7d8b47f21e6d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.387 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcba6dc-4871-4b1d-9ec6-8466c9267cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.388 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d401fd02-3295-4a83-b0b5-77ad58840700]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.407 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4617c3e1-1fa4-4491-9646-a9aac30416a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530314, 'reachable_time': 16864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224958, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.410 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:27.410 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[63b3b5b5-8032-4d3f-a014-3fe4914cefed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:27 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:04:27 np0005539505 nova_compute[186958]: 2025-11-29 07:04:27.449 186962 DEBUG oslo_concurrency.lockutils [None req-5ece34f2-8c4b-4eb2-ac7f-37af857ff616 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "c95d4144-0323-43ff-ad5b-29887709efba" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:28 np0005539505 podman[224963]: 2025-11-29 07:04:28.72927581 +0000 UTC m=+0.061658860 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 02:04:28 np0005539505 nova_compute[186958]: 2025-11-29 07:04:28.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:29 np0005539505 nova_compute[186958]: 2025-11-29 07:04:29.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:33 np0005539505 nova_compute[186958]: 2025-11-29 07:04:33.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:34 np0005539505 nova_compute[186958]: 2025-11-29 07:04:34.450 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:34 np0005539505 nova_compute[186958]: 2025-11-29 07:04:34.450 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:34 np0005539505 nova_compute[186958]: 2025-11-29 07:04:34.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:35 np0005539505 nova_compute[186958]: 2025-11-29 07:04:35.933 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.314 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.314 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.319 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.320 186962 INFO nova.compute.claims [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.452 186962 DEBUG nova.compute.provider_tree [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.467 186962 DEBUG nova.scheduler.client.report [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.486 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.486 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.545 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.545 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.560 186962 INFO nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.578 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.723 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.725 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.726 186962 INFO nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Creating image(s)#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.727 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.728 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.729 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.755 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.825 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.826 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.827 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.844 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.902 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:37 np0005539505 nova_compute[186958]: 2025-11-29 07:04:37.903 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.122 186962 DEBUG nova.policy [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.614 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk 1073741824" returned: 0 in 0.711s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.615 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.788s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.616 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.688 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.690 186962 DEBUG nova.virt.disk.api [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.691 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.748 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.750 186962 DEBUG nova.virt.disk.api [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.751 186962 DEBUG nova.objects.instance [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid ed6db5c3-998e-480f-bda7-6a55dad0a78c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:38 np0005539505 nova_compute[186958]: 2025-11-29 07:04:38.898 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:39 np0005539505 nova_compute[186958]: 2025-11-29 07:04:39.587 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399864.5860345, c95d4144-0323-43ff-ad5b-29887709efba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:39 np0005539505 nova_compute[186958]: 2025-11-29 07:04:39.588 186962 INFO nova.compute.manager [-] [instance: c95d4144-0323-43ff-ad5b-29887709efba] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:04:39 np0005539505 nova_compute[186958]: 2025-11-29 07:04:39.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:39 np0005539505 podman[224999]: 2025-11-29 07:04:39.731023022 +0000 UTC m=+0.051216614 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:04:39 np0005539505 podman[224998]: 2025-11-29 07:04:39.736710993 +0000 UTC m=+0.064266264 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.593 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.594 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Ensure instance console log exists: /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.594 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.595 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.595 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:40 np0005539505 nova_compute[186958]: 2025-11-29 07:04:40.597 186962 DEBUG nova.compute.manager [None req-318d0805-2a7e-4d7d-9e25-ecf564139a7e - - - - - -] [instance: c95d4144-0323-43ff-ad5b-29887709efba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:41 np0005539505 nova_compute[186958]: 2025-11-29 07:04:41.808 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Successfully created port: 722c1241-cdc7-49b5-8a52-ce1fe790e0cd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.416 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.417 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.435 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.493 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b7801223-d966-4047-b510-680042881897" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.494 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.518 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.566 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.567 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.575 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.575 186962 INFO nova.compute.claims [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.620 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.733 186962 DEBUG nova.compute.provider_tree [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.748 186962 DEBUG nova.scheduler.client.report [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.776 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.777 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.785 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.785 186962 INFO nova.compute.claims [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.811 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.812 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.842 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.843 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.923 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.923 186962 DEBUG nova.network.neutron [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.941 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.965 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:42 np0005539505 nova_compute[186958]: 2025-11-29 07:04:42.984 186962 DEBUG nova.compute.provider_tree [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.001 186962 DEBUG nova.scheduler.client.report [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.031 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.049 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.050 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.249 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "6219c7b2-8afd-4c62-a5d3-8c38d1a8ff38" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.250 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.255 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.256 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.257 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Creating image(s)#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.257 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.258 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.258 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.276 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.334 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.335 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.336 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.349 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.373 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.374 186962 DEBUG nova.network.neutron [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.409 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.410 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.431 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.455 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.557 186962 DEBUG nova.network.neutron [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.557 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.576 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.578 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.578 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Creating image(s)#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.579 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.579 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.580 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.596 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.654 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.655 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.662 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk 1073741824" returned: 0 in 0.252s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.663 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.664 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.681 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.693 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.726 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.727 186962 DEBUG nova.virt.disk.api [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Checking if we can resize image /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.727 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.757 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.758 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.789 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.790 186962 DEBUG nova.virt.disk.api [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Cannot resize image /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.790 186962 DEBUG nova.objects.instance [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'migration_context' on Instance uuid b5fe1733-1d14-4b12-870c-69a44f532ef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.795 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.796 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.796 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.818 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.819 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Ensure instance console log exists: /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.820 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.820 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.820 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.822 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.829 186962 WARNING nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.836 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.836 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.839 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.840 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.842 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.842 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.842 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.843 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.844 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.844 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.844 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.848 186962 DEBUG nova.objects.instance [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'pci_devices' on Instance uuid b5fe1733-1d14-4b12-870c-69a44f532ef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.857 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.857 186962 DEBUG nova.virt.disk.api [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Checking if we can resize image /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.858 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.882 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <uuid>b5fe1733-1d14-4b12-870c-69a44f532ef4</uuid>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <name>instance-00000045</name>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1980003394-1</nova:name>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:04:43</nova:creationTime>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:user uuid="0c56214d54944034ac2500edac59a239">tempest-ServersOnMultiNodesTest-2086403841-project-member</nova:user>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:        <nova:project uuid="d09f64becda14f30b831bdf7371d586b">tempest-ServersOnMultiNodesTest-2086403841</nova:project>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="serial">b5fe1733-1d14-4b12-870c-69a44f532ef4</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="uuid">b5fe1733-1d14-4b12-870c-69a44f532ef4</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.config"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/console.log" append="off"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:04:43 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:04:43 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:04:43 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:04:43 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.898 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.929 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.930 186962 DEBUG nova.virt.disk.api [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Cannot resize image /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.930 186962 DEBUG nova.objects.instance [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'migration_context' on Instance uuid b7801223-d966-4047-b510-680042881897 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.947 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.947 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Ensure instance console log exists: /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.948 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.948 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:43 np0005539505 nova_compute[186958]: 2025-11-29 07:04:43.948 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:43 np0005539505 podman[225071]: 2025-11-29 07:04:43.990006158 +0000 UTC m=+0.073958270 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.302 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.303 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.303 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Using config drive#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.306 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Successfully updated port: 722c1241-cdc7-49b5-8a52-ce1fe790e0cd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.348 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.348 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.348 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.517 186962 DEBUG nova.compute.manager [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-changed-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.518 186962 DEBUG nova.compute.manager [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Refreshing instance network info cache due to event network-changed-722c1241-cdc7-49b5-8a52-ce1fe790e0cd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.519 186962 DEBUG oslo_concurrency.lockutils [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.535 186962 DEBUG nova.network.neutron [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.535 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.538 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.542 186962 WARNING nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.549 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Creating config drive at /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.config#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.559 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpovp_6_o2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.580 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.585 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.586 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.589 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.590 186962 DEBUG nova.virt.libvirt.host [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.592 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.592 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.593 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.593 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.593 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.594 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.594 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.594 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.594 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.595 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.595 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.595 186962 DEBUG nova.virt.hardware [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.600 186962 DEBUG nova.objects.instance [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'pci_devices' on Instance uuid b7801223-d966-4047-b510-680042881897 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.661 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <uuid>b7801223-d966-4047-b510-680042881897</uuid>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <name>instance-00000046</name>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersOnMultiNodesTest-server-1980003394-2</nova:name>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:04:44</nova:creationTime>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:user uuid="0c56214d54944034ac2500edac59a239">tempest-ServersOnMultiNodesTest-2086403841-project-member</nova:user>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:        <nova:project uuid="d09f64becda14f30b831bdf7371d586b">tempest-ServersOnMultiNodesTest-2086403841</nova:project>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <nova:ports/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="serial">b7801223-d966-4047-b510-680042881897</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="uuid">b7801223-d966-4047-b510-680042881897</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.config"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/console.log" append="off"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:04:44 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:04:44 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:04:44 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:04:44 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:44 np0005539505 nova_compute[186958]: 2025-11-29 07:04:44.684 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpovp_6_o2" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:44 np0005539505 systemd-machined[153285]: New machine qemu-32-instance-00000045.
Nov 29 02:04:44 np0005539505 systemd[1]: Started Virtual Machine qemu-32-instance-00000045.
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.013 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399885.0127168, b5fe1733-1d14-4b12-870c-69a44f532ef4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.013 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.016 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.016 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.023 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.024 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.025 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Using config drive#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.028 186962 INFO nova.virt.libvirt.driver [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance spawned successfully.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.029 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.062 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.068 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.071 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.071 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.072 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.072 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.072 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.073 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.091 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.092 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399885.015126, b5fe1733-1d14-4b12-870c-69a44f532ef4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.092 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.139 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.143 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.232 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.258 186962 INFO nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Creating config drive at /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.config#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.263 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczsgqojz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.305 186962 INFO nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Took 2.05 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.306 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.421 186962 DEBUG oslo_concurrency.processutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpczsgqojz" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.456 186962 INFO nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Took 2.94 seconds to build instance.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.494 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:45 np0005539505 systemd-machined[153285]: New machine qemu-33-instance-00000046.
Nov 29 02:04:45 np0005539505 systemd[1]: Started Virtual Machine qemu-33-instance-00000046.
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.785 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399885.7845125, b7801223-d966-4047-b510-680042881897 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.785 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.787 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.787 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.792 186962 INFO nova.virt.libvirt.driver [-] [instance: b7801223-d966-4047-b510-680042881897] Instance spawned successfully.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.793 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.817 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.821 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.821 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.822 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.822 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.822 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.823 186962 DEBUG nova.virt.libvirt.driver [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.826 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.861 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.861 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399885.785265, b7801223-d966-4047-b510-680042881897 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.861 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.908 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.911 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.940 186962 INFO nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Took 2.36 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.940 186962 DEBUG nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:45 np0005539505 nova_compute[186958]: 2025-11-29 07:04:45.942 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.018 186962 INFO nova.compute.manager [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Took 3.44 seconds to build instance.#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.039 186962 DEBUG oslo_concurrency.lockutils [None req-1c01b706-841d-4cf7-b2fb-6b36a24e1031 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.290 186962 DEBUG nova.network.neutron [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Updating instance_info_cache with network_info: [{"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.315 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.315 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Instance network_info: |[{"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.316 186962 DEBUG oslo_concurrency.lockutils [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.316 186962 DEBUG nova.network.neutron [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Refreshing network info cache for port 722c1241-cdc7-49b5-8a52-ce1fe790e0cd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.320 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Start _get_guest_xml network_info=[{"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.324 186962 WARNING nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.329 186962 DEBUG nova.virt.libvirt.host [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.330 186962 DEBUG nova.virt.libvirt.host [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.334 186962 DEBUG nova.virt.libvirt.host [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.335 186962 DEBUG nova.virt.libvirt.host [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.337 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.337 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.338 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.338 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.338 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.339 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.339 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.339 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.339 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.340 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.340 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.340 186962 DEBUG nova.virt.hardware [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.344 186962 DEBUG nova.virt.libvirt.vif [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1678810749',display_name='tempest-ServersTestJSON-server-1678810749',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1678810749',id=68,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/b7cPkRmr/+XAGjdn1bjkhWE1hfawkyCthn1Jv+xORYaGlZIECh8qzRWf08923NzTC3Jw4d3d0KGg2a4yxoR8IWQQuNynMDQfeOWEXjmB4LfHDu0XqZbnooU9N/rdZLw==',key_name='tempest-key-93338227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-4jg1c0dy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:37Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=ed6db5c3-998e-480f-bda7-6a55dad0a78c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.345 186962 DEBUG nova.network.os_vif_util [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.346 186962 DEBUG nova.network.os_vif_util [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.347 186962 DEBUG nova.objects.instance [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid ed6db5c3-998e-480f-bda7-6a55dad0a78c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.365 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <uuid>ed6db5c3-998e-480f-bda7-6a55dad0a78c</uuid>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <name>instance-00000044</name>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-1678810749</nova:name>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:04:46</nova:creationTime>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        <nova:port uuid="722c1241-cdc7-49b5-8a52-ce1fe790e0cd">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="serial">ed6db5c3-998e-480f-bda7-6a55dad0a78c</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="uuid">ed6db5c3-998e-480f-bda7-6a55dad0a78c</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.config"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:81:85:55"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <target dev="tap722c1241-cd"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/console.log" append="off"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:04:46 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:04:46 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:04:46 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:04:46 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.367 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Preparing to wait for external event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.367 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.368 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.368 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.369 186962 DEBUG nova.virt.libvirt.vif [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1678810749',display_name='tempest-ServersTestJSON-server-1678810749',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1678810749',id=68,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/b7cPkRmr/+XAGjdn1bjkhWE1hfawkyCthn1Jv+xORYaGlZIECh8qzRWf08923NzTC3Jw4d3d0KGg2a4yxoR8IWQQuNynMDQfeOWEXjmB4LfHDu0XqZbnooU9N/rdZLw==',key_name='tempest-key-93338227',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-4jg1c0dy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:04:37Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=ed6db5c3-998e-480f-bda7-6a55dad0a78c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.369 186962 DEBUG nova.network.os_vif_util [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.370 186962 DEBUG nova.network.os_vif_util [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.370 186962 DEBUG os_vif [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.371 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.372 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.378 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.378 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap722c1241-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.379 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap722c1241-cd, col_values=(('external_ids', {'iface-id': '722c1241-cdc7-49b5-8a52-ce1fe790e0cd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:85:55', 'vm-uuid': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.381 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:46 np0005539505 NetworkManager[55134]: <info>  [1764399886.3840] manager: (tap722c1241-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.385 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.388 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.391 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.393 186962 INFO os_vif [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd')#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.466 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.467 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.467 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:81:85:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:04:46 np0005539505 nova_compute[186958]: 2025-11-29 07:04:46.468 186962 INFO nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Using config drive#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.144 186962 INFO nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Creating config drive at /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.config#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.150 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94hv_qws execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.278 186962 DEBUG oslo_concurrency.processutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94hv_qws" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.3408] manager: (tap722c1241-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/131)
Nov 29 02:04:47 np0005539505 kernel: tap722c1241-cd: entered promiscuous mode
Nov 29 02:04:47 np0005539505 systemd-udevd[225114]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.344 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:47Z|00261|binding|INFO|Claiming lport 722c1241-cdc7-49b5-8a52-ce1fe790e0cd for this chassis.
Nov 29 02:04:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:47Z|00262|binding|INFO|722c1241-cdc7-49b5-8a52-ce1fe790e0cd: Claiming fa:16:3e:81:85:55 10.100.0.12
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.3591] device (tap722c1241-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.3601] device (tap722c1241-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.364 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:85:55 10.100.0.12'], port_security=['fa:16:3e:81:85:55 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=722c1241-cdc7-49b5-8a52-ce1fe790e0cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:04:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:47Z|00263|binding|INFO|Setting lport 722c1241-cdc7-49b5-8a52-ce1fe790e0cd ovn-installed in OVS
Nov 29 02:04:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:47Z|00264|binding|INFO|Setting lport 722c1241-cdc7-49b5-8a52-ce1fe790e0cd up in Southbound
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.366 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 722c1241-cdc7-49b5-8a52-ce1fe790e0cd in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.368 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.382 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[129ba5fb-a75e-45b2-9c03-7a982399e89f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.383 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.384 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.385 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4c6ce4-db40-49fa-a856-81d4ed9c0c7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.385 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f43e6cc-f147-423f-8c9b-2b038a36e01f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 systemd-machined[153285]: New machine qemu-34-instance-00000044.
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.396 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd9021c-d55d-4a94-b689-42a19f99df62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 systemd[1]: Started Virtual Machine qemu-34-instance-00000044.
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.409 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[348dfb17-abb7-4be5-ac27-f080813ecdd6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.450 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a97b9d10-9f58-40ca-8ca1-5d45528ee705]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.459 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[baaa6aa1-b0e0-4e62-ad44-5f83e3d2ddee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.4605] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/132)
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.512 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8814d32-d8a0-45ec-8163-fe0ace6cc6af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.515 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[af904399-b947-4d3b-9c11-362d800d4d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.5444] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.553 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9591df6b-6ad2-457a-a2dd-4daf62c1c9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.572 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[45c72edf-ef88-4419-bf58-0025e1bd3dab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533512, 'reachable_time': 41279, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225196, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.591 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d449a0a-417d-47eb-8243-4804df6666be]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533512, 'tstamp': 533512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225201, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.606 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04c2930f-cb2a-47b4-830c-88a57efbe915]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 85], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533512, 'reachable_time': 41279, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225203, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.641 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6c78c9b9-a103-4f33-adc8-442fdf7353d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.669 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399887.6689281, ed6db5c3-998e-480f-bda7-6a55dad0a78c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.670 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.710 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6804a3-d435-4555-915e-37f2a1d01c51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.712 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.713 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.714 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.761 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 NetworkManager[55134]: <info>  [1764399887.7619] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Nov 29 02:04:47 np0005539505 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.764 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.765 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:47Z|00265|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.779 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.779 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.781 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7dad16f1-3eaa-49db-93ac-273989539f1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.781 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:04:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:47.782 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.969 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.975 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399887.6700094, ed6db5c3-998e-480f-bda7-6a55dad0a78c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:47 np0005539505 nova_compute[186958]: 2025-11-29 07:04:47.976 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.003 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.007 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.027 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.086 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7801223-d966-4047-b510-680042881897', 'name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd09f64becda14f30b831bdf7371d586b', 'user_id': '0c56214d54944034ac2500edac59a239', 'hostId': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.089 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'name': 'tempest-ServersTestJSON-server-1678810749', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000044', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '1dba9539037a4e9dbf33cba140fe21fe', 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'hostId': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.090 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000045', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd09f64becda14f30b831bdf7371d586b', 'user_id': '0c56214d54944034ac2500edac59a239', 'hostId': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.091 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.095 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ed6db5c3-998e-480f-bda7-6a55dad0a78c / tap722c1241-cd inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.095 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5badfc6d-6c66-4703-afae-0b21173c11ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.091287', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b138cc16-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'f2e2c897c116f0c7cbfccfcf8997826c5b8559c9436984800b0bea554f782157'}]}, 'timestamp': '2025-11-29 07:04:48.097883', '_unique_id': '9a5edc37f00a479ca249b2729ec6d40b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.099 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.112 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.113 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.124 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.124 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.136 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c468e02-6466-4e7b-93c4-6e0b319b47b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13b7420-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': 'c4ccc7571554ca41143164038fab2f29a9e7c997356d28feb5584251e8563cca'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13b7f7e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': '9a555944f1ba3a39599dce8c71bbb9efbe943f6fd8517e574cf6faaaf754b83d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13d1ff0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': '24e8fb93b62fad11ea7cdf7fc685654a0e72f71d48d3c8d80c300bcf8d02a62b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b13d2e6e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': 'e56236d3e05e3762d93a42a1b928d9a5aa30ac34b5d762b6f57217ed3d0f458f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 
'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b13efd20-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': 'b4a6e5989beff1a12d6aa03cdc94625666205742ad445e4710b0b4bb72a54dd7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.100533', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: k_name': 'sda'}, 'message_id': 'b13f06d0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': '80feb0cf3ed3038eba8156472f7dcc72c8ecaaf61ec7699c9c89d98ddab3987e'}]}, 'timestamp': '2025-11-29 07:04:48.136888', '_unique_id': 'e98287ebf4d544068bbb6745ac7b6f6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2734eba5-d42b-4c82-ac19-ec130cd4742f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.138937', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b13f626a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'a6940f406321435655af2f01f779c231865b79e305d7faef878a268e5f180e52'}]}, 'timestamp': '2025-11-29 07:04:48.139265', '_unique_id': 'a145628daa5d424b9b6ae2e4bd3a05d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.169 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.170 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.196 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.197 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.233 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.233 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de7c562e-fd31-45aa-baef-f4140f6da038', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1441fee-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': 'ab10f737a46b341149838b1bdf572f2f74de39ec330eefa014f17627a5107a5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b1442d90-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '9834784b42a415b4aa4ad9799d75f64589f7bddf1279d46df75458a1d21686b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b1483c3c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'b4eb58a034302a0793a339e27a5d07b2a347f31c7b8d66514bd1fe36b7fefc30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b148489e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'a19f2b3f35806a457ba0697c376853433a94ca1ca9500b0938462a8fe5fccc18'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 
'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14dd0de-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '87d347722d060b393410efce73013b00116e280fd0b89bef73dc3face3c41ac1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.140606', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: , 'disk_name': 'sda'}, 'message_id': 'b14ddc3c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '71bb69480b69e993dab97e6a52323977699627055046422fb5033b6efa97a786'}]}, 'timestamp': '2025-11-29 07:04:48.234133', '_unique_id': 'ea99b960dea84192b7d482814c8f7bf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.236 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.236 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.236 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.236 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.237 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.237 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bbcfc2d-9204-4e7c-85ce-55626c16a85a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14e3650-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '8084b56e58ef6e291c6ff64dfcca1dfe9e5e7fc071dc29955d9e5d0700ad0ef5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14e3ec0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': 'b2e9ec174d92844aa3e872ff2b9f074fe2b977e24d9c71872e4720ee6b4c1f4e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14e46c2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '57e7c29715d2d8c821d7204cc8c56fbae13d127d7cdfe9ac0f1992996c1ac14f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14e4f5a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '6aa24fc9edb977ff20db6516f6e473c92b99fba191325ad58f1da4ea52d04d9e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 
'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14e5b8a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '112180fc6c218156bd4f2b1de4ed48823987f891abd76994d4be022390b2e41c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.236127', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architec
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: , 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14e671a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '6c228a3284369a9e9c23b6b467415888327a7f6560380ee0339a3d6714b6b5c5'}]}, 'timestamp': '2025-11-29 07:04:48.237693', '_unique_id': '897b2838226b493491288bb693cec5a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.239 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.239 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.239 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.240 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.240 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.240 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3578e03a-f0c6-40bf-99fe-6a497c8d0641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14ec9c6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': '7d2f4ffa6b53e0aef35ba71d5bf274d8a99e35fea67a736b18a4136f2f6d3e1e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 
'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14ed312-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': 'f06b2d7333ca24e5ceb1126f9ae35e3464302161689c35a5cb093cef765048f6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14edaf6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': 'cbc40ac07b2c1ecf606d1c16b62154db5a5e0fde3964de5fb589fdc76fc2a5d5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14ee9ec-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': '02cd15b81071a30b1617e7b5841741e38c83a232ea48996cc0fc959cb42b8c81'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 
'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14ef28e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': '90d4d85f24e2dfd0383a043c7c7f398997b39c1904d40ae4966d06cbd033d1f8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.239944', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: k_name': 'sda'}, 'message_id': 'b14efaa4-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': '95c1397ca6439373f86187df931b9b0f73157ad086ae80830581b3c1e33fd4c0'}]}, 'timestamp': '2025-11-29 07:04:48.241424', '_unique_id': 'a7792535724c494fba9af0e18f5565de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.242 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7cb2b88-9802-4b97-87eb-1e4fec66026b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.242730', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b14f371c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'd20b6e33c6979e27d34ea3818a3d309b115067a545e2a017a9968db45e855ae0'}]}, 'timestamp': '2025-11-29 07:04:48.242990', '_unique_id': '7e25a177ff394038805829398bea6f6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5973e43-6dc9-479a-a727-5e86dc61bcdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.244077', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b14f6b38-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'e129ff3a174ec05a2ee357f1d2a92de023fb717a7cf41d14b3187a713ccee56d'}]}, 'timestamp': '2025-11-29 07:04:48.244337', '_unique_id': '303c4d0f955d4f4f9cdaf8200fdc718a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.245 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.245 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.245 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.246 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.246 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.246 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.246 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.247 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2806d7c5-4cb1-431b-b56d-85213a027c2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14fb480-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': 'f93ac866e1f08837d6a81fd5997fff4108bda9869a4e8c144f98e2e916637ae5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14fbdc2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '06874b12fba2ce3a76f68c3bc50a2f2d92e840ae3a2a9f0c096c93603e8611cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14fc5f6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '4eea552148697f0074f084e835e3e9949e2d690a09f98a117ad69897324e6451'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14fce0c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '548a8bf4e1aa1066cfa47fc8e0f7ced05a9c1aa0d0d6c1dd054f70cae1420598'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b14fd708-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '94b7aa119c362ef9ade9970f612a4bc778e98bc319c71bbb169374e9a22c9b38'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.245971', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'arch
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: ': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b14fe220-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '4d6be89b97a32f20527bb2feed8984c2f93fe7b325d302b65d3c585130adcfe2'}]}, 'timestamp': '2025-11-29 07:04:48.247399', '_unique_id': 'cc59d4c40219498f88d506fc32840e65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.248 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.249 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.249 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '788d4333-eb48-4cd1-8d9e-27eff4f21755', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.249365', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b1503afe-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'c0df4e9c91512cb493085bac9fecd3bdca9d26d4ed8e43f7978368b21624f346'}]}, 'timestamp': '2025-11-29 07:04:48.249677', '_unique_id': '796e33d1e205473c8d7e5b0bb8dc5e81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 podman[225235]: 2025-11-29 07:04:48.190254478 +0000 UTC m=+0.024147546 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.286 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.286 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b7801223-d966-4047-b510-680042881897: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.309 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.309 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance ed6db5c3-998e-480f-bda7-6a55dad0a78c: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.328 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.328 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance b5fe1733-1d14-4b12-870c-69a44f532ef4: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.329 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.329 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.329 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.329 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.330 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.330 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9634f14c-c325-452f-80a1-15dfdd11ddb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15c67fc-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': 'f3ec869665b5a1d9775a7d1a7cd8e484bd8340e2cdd27e68b69b3c744267b7d5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 
'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15c72b0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.741374899, 'message_signature': '87e05478f87c969c81d77d6136f1ed5ed2a2389117480885c6cd62a272b2fa17'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15c7a58-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': 'b7ff64149553ebba2df9d112c8c624ffa340a80cfd19e71e57d1bad4a78872ed'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15c839a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.754592064, 'message_signature': '494d6edb305d11a36c8549c256bf4832ce3211940c5386d2b1580f6618c397bc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 
'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15c8c00-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': '030e0cacdf4ae9f0f0d81243c61c49e97bef508f645507f0144fea12bd7406a7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.329129', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 
'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'me
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 'b15c9768-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.765664028, 'message_signature': '0e72e67f18b20922fb99b48dd47518faf395245f50c3071028e7bb337c8d3207'}]}, 'timestamp': '2025-11-29 07:04:48.330681', '_unique_id': '1d16ca841096478bb394c1b529b0ef7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.332 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.333 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.333 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.333 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.333 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eb770ec-e54b-44f0-9817-cb4a02a7bd3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15cfa96-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '2953851a651c852dbda5f9d6d3752cce0cdc21ceaedb16eb600e7cc2eaee302e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15d066c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': 'e368ab376034815c8d69b8f6ef9be7567936d637dcfa8908c269b4d19f8b0684'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15d0fd6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'eb59bc60930912626c6c8f0ad232c80ce21040c99a82a059fcf95ca5664a44f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15d1742-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'bdbf421c201134c09b7cb0d68a00998377aa6af1380cbe93963b9da5865ada5e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15d1e72-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '8192bc352c562549b7e3dc2bf6b9812e8e9d19e6082158bedfc116025a91b94d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.332907', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddea
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: b': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15d258e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': 'df82437ccf8861b3713423e607892eaee807f786bbd9bae630bcd3de1e5c6cdc'}]}, 'timestamp': '2025-11-29 07:04:48.334289', '_unique_id': 'd498086a1dbf4e2291f68af0d3b5b4d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.335 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d2253dc-8bc0-41e3-a684-fa08bda6dd83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.335549', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b15d608a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': 'c630723cb0cc04bd61ea2bb231dc4a07bb3354b1e02f4810d2ec68d30c948122'}]}, 'timestamp': '2025-11-29 07:04:48.335801', '_unique_id': '324ee97ccee84e8caea5f33ab0ca77cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.336 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '025a7929-8f40-43b6-b8a3-a2529442d686', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.336884', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b15d949c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': '86d84e03166e0930551f6a50ca0db7687e8bdb324d200b2273e350aaf64b85f6'}]}, 'timestamp': '2025-11-29 07:04:48.337168', '_unique_id': 'c7df2e96d4324270a79e83d57608ca78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.337 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.338 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b2cf8631-2c60-44df-b28f-ca1c77fb34a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.338606', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b15dd8e4-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': '4747fa547ab84845d76d55ab77473c28941d8254b5602da68631cd7c1e8980c4'}]}, 'timestamp': '2025-11-29 07:04:48.338915', '_unique_id': 'a52134cdc6864633b7243710597e5c8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.339 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.340 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.latency volume: 192187271 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.340 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.latency volume: 903285 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.340 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.340 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.341 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.latency volume: 161305423 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.341 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.latency volume: 15357576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6519a20-2500-4e33-9ea5-c9835eb1e3aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 192187271, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15e1958-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '684894c6303a774b83a01120e3ed292490960a29a8ebaf7780f2f90aaac1e738'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 903285, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15e213c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '4250d2e5c4881b563e5618ff444369832d060b16f1b3e2ba5ea4547ffc4b99cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15e288a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '036582cf2cfb18896d7e4a4b09ce414df2d00c732aa2e611004d9a7ab6d05107'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15e2fa6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': '78fd92e4cb9a28e6b33c8656d2c2736360ec74765e127313deeac5fbdd549d90'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 161305423, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15e36c2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '785f50eb2e6a78422fba83a6202b17c86fbbfa11606de450b2ca8169792dbdd1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15357576, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.340289', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: _gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15e4180-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': 'b0d7a8f7dba0b8fbd11cb40c512c637748610a7873fb74acad4cbe6bdb045aed'}]}, 'timestamp': '2025-11-29 07:04:48.341537', '_unique_id': '2149853615b34824ac8b20057ba715ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.342 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8242fb5-c609-4507-9775-5d87529670e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.342896', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b15e7f06-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': '14b6635a9857d1bfd5a111f6c90436b064c9946841238f86bb7cfbd14f6d67da'}]}, 'timestamp': '2025-11-29 07:04:48.343125', '_unique_id': '7b12ab9113bc4b55bc3b2563c57285ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection 
refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.343 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.344 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.344 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.344 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.344 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.344 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.345 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.345 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db8fe790-0f66-4514-aa25-3940655e5508', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-vda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15eb552-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '5824e942b7e886ed6452c71a122b0029da76053fb4d7d60cc74b06f788cb35e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 
'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897-sda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15ebdae-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.781451256, 'message_signature': '93459c4b51916ac7e55016e3fe194a7e3cb857aa927d8e7531284abf1f02d3b0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-vda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15ec696-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'c8610bb719695169249bea0cd924dde0b5a89f50eaaac4c8b62c1c76845113ed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c-sda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15ece16-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.811481778, 'message_signature': 'df3822a8455cc9d1bcfceb48aa52a693772971828664978e97820a6ceafdc1a0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '0c56214d54944034ac2500edac59a239', 
'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-vda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b15ed65e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': 'ac690d53b0cfd10bafc02d2e634e38a30d1f554c02a6df2ce3edbdbbff514363'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4-sda', 'timestamp': '2025-11-29T07:04:48.344290', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: : 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b15eddb6-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.838385591, 'message_signature': '7efa813124e1def678bd1e9e003ba8a57e53fcaf4610c1d4c1a0ffc2bb390ba1'}]}, 'timestamp': '2025-11-29 07:04:48.345528', '_unique_id': 'fc48f2f22f0b46a59d4bf9974fa390f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 DEBUG ceilometer.compute.pollsters [-] b7801223-d966-4047-b510-680042881897/cpu volume: 2370000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.346 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 DEBUG ceilometer.compute.pollsters [-] b5fe1733-1d14-4b12-870c-69a44f532ef4/cpu volume: 3100000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aff08e0b-ce75-4be3-bb24-5971019d196c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2370000000, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b7801223-d966-4047-b510-680042881897', 'timestamp': '2025-11-29T07:04:48.346706', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-2', 'name': 'instance-00000046', 'instance_id': 'b7801223-d966-4047-b510-680042881897', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b15f13b2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.927193201, 'message_signature': '76f4f775a14820f251c8b8f66d6cbd8ff72acf711e27b8ff0c454e2f0113092c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'timestamp': '2025-11-29T07:04:48.346706', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'instance-00000044', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b15f1b50-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.950274287, 'message_signature': '339e2a98caceeb8d4ec1b3ec2fbc3deb921fb474259dab2649185255471cc273'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3100000000, 'user_id': '0c56214d54944034ac2500edac59a239', 'user_name': None, 'project_id': 'd09f64becda14f30b831bdf7371d586b', 'project_name': None, 'resource_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'timestamp': '2025-11-29T07:04:48.346706', 'resource_metadata': {'display_name': 'tempest-ServersOnMultiNodesTest-server-1980003394-1', 'name': 'instance-00000045', 'instance_id': 'b5fe1733-1d14-4b12-870c-69a44f532ef4', 'instance_type': 'm1.nano', 'host': '2005f8f6f5082c6aa6607afb13bb67c7ea92b55a82b694e13edd9ef6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b15f2294-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.969580895, 'message_signature': 'a4b16cc3aa37a7e69a2d1e95bfa545a9f0d6577ae0d770d9ae45d5111b180cbc'}]}, 'timestamp': '2025-11-29 07:04:48.347308', '_unique_id': '10ac52a749a74d39876c15d7f813d056'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.348 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.348 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-2>, <NovaLikeServer: tempest-ServersTestJSON-server-1678810749>, <NovaLikeServer: tempest-ServersOnMultiNodesTest-server-1980003394-1>]
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.348 12 DEBUG ceilometer.compute.pollsters [-] ed6db5c3-998e-480f-bda7-6a55dad0a78c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fa9a0bf-131b-4169-8c49-ae950b1378d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-00000044-ed6db5c3-998e-480f-bda7-6a55dad0a78c-tap722c1241-cd', 'timestamp': '2025-11-29T07:04:48.348681', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-1678810749', 'name': 'tap722c1241-cd', 'instance_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:81:85:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap722c1241-cd'}, 'message_id': 'b15f60e2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5335.73366115, 'message_signature': '64abe572f93b4d6b299d5c680e6cc3423b99c5a5879d632be730732d444cacab'}]}, 'timestamp': '2025-11-29 07:04:48.348904', '_unique_id': '7e51bfe985e24ee4aebfe5d26274f6ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:04:48.349 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.435 186962 DEBUG nova.compute.manager [req-eb1ca005-7676-4337-8285-8427064afeec req-8bef0f92-2a42-44b8-90b5-4b3852e13670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.436 186962 DEBUG oslo_concurrency.lockutils [req-eb1ca005-7676-4337-8285-8427064afeec req-8bef0f92-2a42-44b8-90b5-4b3852e13670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.436 186962 DEBUG oslo_concurrency.lockutils [req-eb1ca005-7676-4337-8285-8427064afeec req-8bef0f92-2a42-44b8-90b5-4b3852e13670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.436 186962 DEBUG oslo_concurrency.lockutils [req-eb1ca005-7676-4337-8285-8427064afeec req-8bef0f92-2a42-44b8-90b5-4b3852e13670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.436 186962 DEBUG nova.compute.manager [req-eb1ca005-7676-4337-8285-8427064afeec req-8bef0f92-2a42-44b8-90b5-4b3852e13670 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Processing event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.437 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.446 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399888.4405882, ed6db5c3-998e-480f-bda7-6a55dad0a78c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.447 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.448 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.453 186962 INFO nova.virt.libvirt.driver [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Instance spawned successfully.#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.453 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.474 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.479 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.482 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.483 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.483 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.484 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.485 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.486 186962 DEBUG nova.virt.libvirt.driver [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.331 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.334 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.512 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.342 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 rsyslogd[1004]: message too long (8192) with configured size 8096, begin of message is: 2025-11-29 07:04:48.346 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.561 186962 INFO nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Took 10.84 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.562 186962 DEBUG nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.570 186962 DEBUG nova.network.neutron [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Updated VIF entry in instance network info cache for port 722c1241-cdc7-49b5-8a52-ce1fe790e0cd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.571 186962 DEBUG nova.network.neutron [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Updating instance_info_cache with network_info: [{"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.597 186962 DEBUG oslo_concurrency.lockutils [req-440191de-0fce-4eb6-b5f2-347302a09120 req-67b2b869-21ba-41dd-8cf4-bbe99d946556 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ed6db5c3-998e-480f-bda7-6a55dad0a78c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.659 186962 INFO nova.compute.manager [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Took 11.40 seconds to build instance.#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.685 186962 DEBUG oslo_concurrency.lockutils [None req-6af274c5-939c-4e77-9cca-4d3795fca90e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:48 np0005539505 nova_compute[186958]: 2025-11-29 07:04:48.930 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:49 np0005539505 podman[225235]: 2025-11-29 07:04:49.718422432 +0000 UTC m=+1.552315500 container create b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:04:50 np0005539505 systemd[1]: Started libpod-conmon-b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7.scope.
Nov 29 02:04:50 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:04:50 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44cb971ca8511dec6e39ed93122408fe36290efbcc7266db9a266419b98fffd4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.894 186962 DEBUG nova.compute.manager [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.894 186962 DEBUG oslo_concurrency.lockutils [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.894 186962 DEBUG oslo_concurrency.lockutils [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.894 186962 DEBUG oslo_concurrency.lockutils [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.895 186962 DEBUG nova.compute.manager [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] No waiting events found dispatching network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:04:50 np0005539505 nova_compute[186958]: 2025-11-29 07:04:50.895 186962 WARNING nova.compute.manager [req-9202ccca-fc1f-4aa2-9933-8c4fbb175192 req-a2d36bc7-5728-4bd5-9029-f7b56e9f88cb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received unexpected event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd for instance with vm_state active and task_state None.#033[00m
Nov 29 02:04:50 np0005539505 podman[225235]: 2025-11-29 07:04:50.901371959 +0000 UTC m=+2.735264997 container init b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 02:04:50 np0005539505 podman[225235]: 2025-11-29 07:04:50.909686435 +0000 UTC m=+2.743579473 container start b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:04:50 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [NOTICE]   (225271) : New worker (225273) forked
Nov 29 02:04:50 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [NOTICE]   (225271) : Loading success.
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.382 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.978 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.979 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.979 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.979 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.980 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:51 np0005539505 nova_compute[186958]: 2025-11-29 07:04:51.994 186962 INFO nova.compute.manager [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Terminating instance#033[00m
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.010 186962 DEBUG nova.compute.manager [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:04:52 np0005539505 kernel: tap722c1241-cd (unregistering): left promiscuous mode
Nov 29 02:04:52 np0005539505 NetworkManager[55134]: <info>  [1764399892.0327] device (tap722c1241-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:04:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:52Z|00266|binding|INFO|Releasing lport 722c1241-cdc7-49b5-8a52-ce1fe790e0cd from this chassis (sb_readonly=0)
Nov 29 02:04:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:52Z|00267|binding|INFO|Setting lport 722c1241-cdc7-49b5-8a52-ce1fe790e0cd down in Southbound
Nov 29 02:04:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:04:52Z|00268|binding|INFO|Removing iface tap722c1241-cd ovn-installed in OVS
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.043 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.051 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:85:55 10.100.0.12'], port_security=['fa:16:3e:81:85:55 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ed6db5c3-998e-480f-bda7-6a55dad0a78c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=722c1241-cdc7-49b5-8a52-ce1fe790e0cd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.068 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000044.scope: Deactivated successfully.
Nov 29 02:04:52 np0005539505 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000044.scope: Consumed 3.832s CPU time.
Nov 29 02:04:52 np0005539505 systemd-machined[153285]: Machine qemu-34-instance-00000044 terminated.
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.280 186962 INFO nova.virt.libvirt.driver [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Instance destroyed successfully.
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.281 186962 DEBUG nova.objects.instance [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid ed6db5c3-998e-480f-bda7-6a55dad0a78c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.302 186962 DEBUG nova.virt.libvirt.vif [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1678810749',display_name='tempest-ServersTestJSON-server-1678810749',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1678810749',id=68,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/b7cPkRmr/+XAGjdn1bjkhWE1hfawkyCthn1Jv+xORYaGlZIECh8qzRWf08923NzTC3Jw4d3d0KGg2a4yxoR8IWQQuNynMDQfeOWEXjmB4LfHDu0XqZbnooU9N/rdZLw==',key_name='tempest-key-93338227',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-4jg1c0dy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:04:48Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=ed6db5c3-998e-480f-bda7-6a55dad0a78c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.303 186962 DEBUG nova.network.os_vif_util [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "address": "fa:16:3e:81:85:55", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap722c1241-cd", "ovs_interfaceid": "722c1241-cdc7-49b5-8a52-ce1fe790e0cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.304 186962 DEBUG nova.network.os_vif_util [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.305 186962 DEBUG os_vif [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.308 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.308 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap722c1241-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.311 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.313 186962 INFO os_vif [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:85:55,bridge_name='br-int',has_traffic_filtering=True,id=722c1241-cdc7-49b5-8a52-ce1fe790e0cd,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap722c1241-cd')
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.314 186962 INFO nova.virt.libvirt.driver [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Deleting instance files /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c_del
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.315 186962 INFO nova.virt.libvirt.driver [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Deletion of /var/lib/nova/instances/ed6db5c3-998e-480f-bda7-6a55dad0a78c_del complete
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.422 186962 INFO nova.compute.manager [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Took 0.41 seconds to destroy the instance on the hypervisor.
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.422 186962 DEBUG oslo.service.loopingcall [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.433 186962 DEBUG nova.compute.manager [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.433 186962 DEBUG nova.network.neutron [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:04:52 np0005539505 podman[225250]: 2025-11-29 07:04:52.466870624 +0000 UTC m=+2.073571187 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.484 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 722c1241-cdc7-49b5-8a52-ce1fe790e0cd in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.486 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.487 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[39a53a49-2858-4b63-a519-09467429f893]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.487 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore
Nov 29 02:04:52 np0005539505 podman[225253]: 2025-11-29 07:04:52.50477482 +0000 UTC m=+2.098573487 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 29 02:04:52 np0005539505 nova_compute[186958]: 2025-11-29 07:04:52.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:52.629 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [NOTICE]   (225271) : haproxy version is 2.8.14-c23fe91
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [NOTICE]   (225271) : path to executable is /usr/sbin/haproxy
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [WARNING]  (225271) : Exiting Master process...
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [WARNING]  (225271) : Exiting Master process...
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [ALERT]    (225271) : Current worker (225273) exited with code 143 (Terminated)
Nov 29 02:04:52 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225251]: [WARNING]  (225271) : All workers exited. Exiting... (0)
Nov 29 02:04:52 np0005539505 systemd[1]: libpod-b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7.scope: Deactivated successfully.
Nov 29 02:04:52 np0005539505 conmon[225251]: conmon b086cae154e2ff660bda <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7.scope/container/memory.events
Nov 29 02:04:52 np0005539505 podman[225350]: 2025-11-29 07:04:52.792265439 +0000 UTC m=+0.206002097 container died b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.373 186962 DEBUG nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-unplugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.373 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.374 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.375 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.375 186962 DEBUG nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] No waiting events found dispatching network-vif-unplugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.376 186962 DEBUG nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-unplugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.376 186962 DEBUG nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.377 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.377 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.378 186962 DEBUG oslo_concurrency.lockutils [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.378 186962 DEBUG nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] No waiting events found dispatching network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.379 186962 WARNING nova.compute.manager [req-2a050bad-52c4-461d-be70-8fcc023b81a4 req-72c700fe-49a4-4989-b03b-011c2badd340 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received unexpected event network-vif-plugged-722c1241-cdc7-49b5-8a52-ce1fe790e0cd for instance with vm_state active and task_state deleting.
Nov 29 02:04:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7-userdata-shm.mount: Deactivated successfully.
Nov 29 02:04:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay-44cb971ca8511dec6e39ed93122408fe36290efbcc7266db9a266419b98fffd4-merged.mount: Deactivated successfully.
Nov 29 02:04:53 np0005539505 nova_compute[186958]: 2025-11-29 07:04:53.933 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:54 np0005539505 podman[225350]: 2025-11-29 07:04:54.348146312 +0000 UTC m=+1.761882970 container cleanup b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:04:54 np0005539505 systemd[1]: libpod-conmon-b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7.scope: Deactivated successfully.
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.390 186962 DEBUG nova.network.neutron [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.419 186962 INFO nova.compute.manager [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Took 1.99 seconds to deallocate network for instance.
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.504 186962 DEBUG nova.compute.manager [req-1f9b48c7-8ce3-41b7-8430-4c8943ba8d69 req-306604b9-f23c-4144-b93d-c71bb026a5aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Received event network-vif-deleted-722c1241-cdc7-49b5-8a52-ce1fe790e0cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.510 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.510 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:04:54 np0005539505 podman[225380]: 2025-11-29 07:04:54.548348164 +0000 UTC m=+0.173843055 container remove b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.554 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b850a45c-dc26-4c6b-b20c-d7c1dfc77527]: (4, ('Sat Nov 29 07:04:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7)\nb086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7\nSat Nov 29 07:04:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (b086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7)\nb086cae154e2ff660bda856e8a8131123cfbc0f65fc49c64be0038c62b4e07a7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.557 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[eb623ebe-10ed-4154-a15c-445862a1d08e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.559 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.561 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:04:54 np0005539505 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.566 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.568 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[be720354-4e81-426b-945f-a34b564e9720]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.578 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.585 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[51ed4c90-e968-4bc7-8435-3d88b9f6b9e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.587 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb51ebd-ed1e-4403-9c39-8e1989c8fd3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.608 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[105a81f0-4242-4b35-891b-1441677357b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533502, 'reachable_time': 22375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225396, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.612 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.613 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[333901ff-9c12-4b7e-bdbb-06cccd162b6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:04:54 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:04:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:54.614 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.629 186962 DEBUG nova.compute.provider_tree [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.648 186962 DEBUG nova.scheduler.client.report [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.686 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.728 186962 INFO nova.scheduler.client.report [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance ed6db5c3-998e-480f-bda7-6a55dad0a78c#033[00m
Nov 29 02:04:54 np0005539505 nova_compute[186958]: 2025-11-29 07:04:54.824 186962 DEBUG oslo_concurrency.lockutils [None req-1c4dd9e0-9aef-4a32-b7a8-418de02237a6 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "ed6db5c3-998e-480f-bda7-6a55dad0a78c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.413 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.415 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.415 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.416 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.491 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.553 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.555 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.612 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.618 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.681 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.683 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.748 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.894 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.896 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5440MB free_disk=73.22471237182617GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.896 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:55 np0005539505 nova_compute[186958]: 2025-11-29 07:04:55.897 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.219 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance b5fe1733-1d14-4b12-870c-69a44f532ef4 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.219 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance b7801223-d966-4047-b510-680042881897 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.220 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.220 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.273 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.436 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.841 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:04:56 np0005539505 nova_compute[186958]: 2025-11-29 07:04:56.842 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:04:57 np0005539505 nova_compute[186958]: 2025-11-29 07:04:57.311 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:57 np0005539505 podman[225413]: 2025-11-29 07:04:57.7257162 +0000 UTC m=+0.052169232 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS 
Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 02:04:58 np0005539505 nova_compute[186958]: 2025-11-29 07:04:58.838 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:04:58 np0005539505 nova_compute[186958]: 2025-11-29 07:04:58.952 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:04:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:04:59.616 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:04:59 np0005539505 podman[225452]: 2025-11-29 07:04:59.723906866 +0000 UTC m=+0.054176129 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 02:04:59 np0005539505 nova_compute[186958]: 2025-11-29 07:04:59.797 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:59 np0005539505 nova_compute[186958]: 2025-11-29 07:04:59.798 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:04:59 np0005539505 nova_compute[186958]: 2025-11-29 07:04:59.820 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:04:59 np0005539505 nova_compute[186958]: 2025-11-29 07:04:59.995 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:04:59 np0005539505 nova_compute[186958]: 2025-11-29 07:04:59.995 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.003 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.004 186962 INFO nova.compute.claims [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.210 186962 DEBUG nova.compute.provider_tree [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.224 186962 DEBUG nova.scheduler.client.report [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.245 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.246 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.301 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.301 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.321 186962 INFO nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.338 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.462 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.463 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.464 186962 INFO nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Creating image(s)#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.464 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.465 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.466 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.481 186962 DEBUG nova.policy [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.484 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.542 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.543 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.544 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.555 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.618 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:00 np0005539505 nova_compute[186958]: 2025-11-29 07:05:00.620 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:01 np0005539505 nova_compute[186958]: 2025-11-29 07:05:01.177 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Successfully created port: 29aac0a6-692c-4971-9359-052956337832 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.315 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.456 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Successfully updated port: 29aac0a6-692c-4971-9359-052956337832 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.561 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.562 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.562 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.571 186962 DEBUG nova.compute.manager [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-changed-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.571 186962 DEBUG nova.compute.manager [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Refreshing instance network info cache due to event network-changed-29aac0a6-692c-4971-9359-052956337832. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.572 186962 DEBUG oslo_concurrency.lockutils [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:02 np0005539505 nova_compute[186958]: 2025-11-29 07:05:02.858 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.570 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.743 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.744 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.744 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.744 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b5fe1733-1d14-4b12-870c-69a44f532ef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:03 np0005539505 nova_compute[186958]: 2025-11-29 07:05:03.953 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.073 186962 DEBUG nova.network.neutron [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Updating instance_info_cache with network_info: [{"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.409 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.410 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Instance network_info: |[{"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.410 186962 DEBUG oslo_concurrency.lockutils [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.410 186962 DEBUG nova.network.neutron [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Refreshing network info cache for port 29aac0a6-692c-4971-9359-052956337832 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.442 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.687 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.688 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.688 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b5fe1733-1d14-4b12-870c-69a44f532ef4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.688 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.689 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.747 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.755 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk 1073741824" returned: 0 in 4.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.756 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 4.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.757 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.815 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.816 186962 DEBUG nova.virt.disk.api [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.816 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.870 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.870 186962 DEBUG nova.virt.disk.api [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.871 186962 DEBUG nova.objects.instance [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid b7cf911a-b1c6-47f2-aed3-6384f2ef588c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.927 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.927 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.934 186962 INFO nova.compute.manager [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Terminating instance#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.951 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.951 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquired lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.952 186962 DEBUG nova.network.neutron [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.975 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.975 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Ensure instance console log exists: /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.976 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.976 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.976 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.978 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Start _get_guest_xml network_info=[{"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.987 186962 WARNING nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.990 186962 DEBUG nova.virt.libvirt.host [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:05:04 np0005539505 nova_compute[186958]: 2025-11-29 07:05:04.991 186962 DEBUG nova.virt.libvirt.host [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.000 186962 DEBUG nova.virt.libvirt.host [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.001 186962 DEBUG nova.virt.libvirt.host [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.002 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.003 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.003 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.003 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.003 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.004 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.004 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.004 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.004 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.005 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.005 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.005 186962 DEBUG nova.virt.hardware [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.009 186962 DEBUG nova.virt.libvirt.vif [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=73,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-lb0i5ir4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},t
ags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:00Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=b7cf911a-b1c6-47f2-aed3-6384f2ef588c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.009 186962 DEBUG nova.network.os_vif_util [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.009 186962 DEBUG nova.network.os_vif_util [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.010 186962 DEBUG nova.objects.instance [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid b7cf911a-b1c6-47f2-aed3-6384f2ef588c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.143 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <uuid>b7cf911a-b1c6-47f2-aed3-6384f2ef588c</uuid>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <name>instance-00000049</name>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-1328477640</nova:name>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:05:04</nova:creationTime>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        <nova:port uuid="29aac0a6-692c-4971-9359-052956337832">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="serial">b7cf911a-b1c6-47f2-aed3-6384f2ef588c</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="uuid">b7cf911a-b1c6-47f2-aed3-6384f2ef588c</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.config"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:00:4d:21"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <target dev="tap29aac0a6-69"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/console.log" append="off"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:05:05 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:05:05 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:05:05 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:05:05 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.145 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Preparing to wait for external event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.145 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.146 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.146 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.146 186962 DEBUG nova.virt.libvirt.vif [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=73,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-lb0i5ir4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:00Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=b7cf911a-b1c6-47f2-aed3-6384f2ef588c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.147 186962 DEBUG nova.network.os_vif_util [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.147 186962 DEBUG nova.network.os_vif_util [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.148 186962 DEBUG os_vif [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.148 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.149 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.151 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.155 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29aac0a6-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.156 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29aac0a6-69, col_values=(('external_ids', {'iface-id': '29aac0a6-692c-4971-9359-052956337832', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:4d:21', 'vm-uuid': 'b7cf911a-b1c6-47f2-aed3-6384f2ef588c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.158 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:05 np0005539505 NetworkManager[55134]: <info>  [1764399905.1601] manager: (tap29aac0a6-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.166 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.168 186962 INFO os_vif [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69')
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.231 186962 DEBUG nova.network.neutron [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.236 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b7801223-d966-4047-b510-680042881897" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.236 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.237 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "b7801223-d966-4047-b510-680042881897-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.237 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.237 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.256 186962 INFO nova.compute.manager [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Terminating instance
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.268 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "refresh_cache-b7801223-d966-4047-b510-680042881897" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.269 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquired lock "refresh_cache-b7801223-d966-4047-b510-680042881897" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.269 186962 DEBUG nova.network.neutron [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.490 186962 DEBUG nova.network.neutron [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.566 186962 DEBUG nova.network.neutron [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.598 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Releasing lock "refresh_cache-b5fe1733-1d14-4b12-870c-69a44f532ef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.599 186962 DEBUG nova.compute.manager [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.815 186962 DEBUG nova.network.neutron [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Updated VIF entry in instance network info cache for port 29aac0a6-692c-4971-9359-052956337832. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.816 186962 DEBUG nova.network.neutron [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Updating instance_info_cache with network_info: [{"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:05 np0005539505 nova_compute[186958]: 2025-11-29 07:05:05.878 186962 DEBUG nova.network.neutron [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.024 186962 DEBUG oslo_concurrency.lockutils [req-c6d3c9d7-1228-4ae4-bb00-1d4197c7a452 req-8cb05d89-bac0-4027-9fef-402c7dc33b56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b7cf911a-b1c6-47f2-aed3-6384f2ef588c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.130 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Releasing lock "refresh_cache-b7801223-d966-4047-b510-680042881897" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.131 186962 DEBUG nova.compute.manager [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 02:05:06 np0005539505 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000045.scope: Deactivated successfully.
Nov 29 02:05:06 np0005539505 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000045.scope: Consumed 11.740s CPU time.
Nov 29 02:05:06 np0005539505 systemd-machined[153285]: Machine qemu-32-instance-00000045 terminated.
Nov 29 02:05:06 np0005539505 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000046.scope: Deactivated successfully.
Nov 29 02:05:06 np0005539505 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000046.scope: Consumed 11.127s CPU time.
Nov 29 02:05:06 np0005539505 systemd-machined[153285]: Machine qemu-33-instance-00000046 terminated.
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.447 186962 INFO nova.virt.libvirt.driver [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance destroyed successfully.
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.447 186962 DEBUG nova.objects.instance [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'resources' on Instance uuid b5fe1733-1d14-4b12-870c-69a44f532ef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.578 186962 INFO nova.virt.libvirt.driver [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Deleting instance files /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4_del
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.578 186962 INFO nova.virt.libvirt.driver [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Deletion of /var/lib/nova/instances/b5fe1733-1d14-4b12-870c-69a44f532ef4_del complete
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.587 186962 INFO nova.virt.libvirt.driver [-] [instance: b7801223-d966-4047-b510-680042881897] Instance destroyed successfully.
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.587 186962 DEBUG nova.objects.instance [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lazy-loading 'resources' on Instance uuid b7801223-d966-4047-b510-680042881897 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.635 186962 INFO nova.virt.libvirt.driver [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Deleting instance files /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897_del
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.636 186962 INFO nova.virt.libvirt.driver [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Deletion of /var/lib/nova/instances/b7801223-d966-4047-b510-680042881897_del complete
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.649 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.649 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.649 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:00:4d:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.650 186962 INFO nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Using config drive
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.779 186962 INFO nova.compute.manager [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Took 1.18 seconds to destroy the instance on the hypervisor.
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.780 186962 DEBUG oslo.service.loopingcall [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.780 186962 DEBUG nova.compute.manager [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.780 186962 DEBUG nova.network.neutron [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.811 186962 INFO nova.compute.manager [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] [instance: b7801223-d966-4047-b510-680042881897] Took 0.68 seconds to destroy the instance on the hypervisor.
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.812 186962 DEBUG oslo.service.loopingcall [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.812 186962 DEBUG nova.compute.manager [-] [instance: b7801223-d966-4047-b510-680042881897] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.812 186962 DEBUG nova.network.neutron [-] [instance: b7801223-d966-4047-b510-680042881897] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.934 186962 DEBUG nova.network.neutron [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:05:06 np0005539505 nova_compute[186958]: 2025-11-29 07:05:06.961 186962 DEBUG nova.network.neutron [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.013 186962 INFO nova.compute.manager [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Took 0.23 seconds to deallocate network for instance.
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.051 186962 DEBUG nova.network.neutron [-] [instance: b7801223-d966-4047-b510-680042881897] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.107 186962 DEBUG nova.network.neutron [-] [instance: b7801223-d966-4047-b510-680042881897] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.137 186962 INFO nova.compute.manager [-] [instance: b7801223-d966-4047-b510-680042881897] Took 0.32 seconds to deallocate network for instance.
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.141 186962 INFO nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Creating config drive at /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.config
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.147 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwv4sahtl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.277 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399892.2762494, ed6db5c3-998e-480f-bda7-6a55dad0a78c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.278 186962 INFO nova.compute.manager [-] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] VM Stopped (Lifecycle Event)
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.280 186962 DEBUG oslo_concurrency.processutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwv4sahtl" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.303 186962 DEBUG nova.compute.manager [None req-35c5270b-779c-4e49-a781-f68fc18c1e53 - - - - - -] [instance: ed6db5c3-998e-480f-bda7-6a55dad0a78c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:05:07 np0005539505 kernel: tap29aac0a6-69: entered promiscuous mode
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.3482] manager: (tap29aac0a6-69): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Nov 29 02:05:07 np0005539505 systemd-udevd[225490]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:05:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:07Z|00269|binding|INFO|Claiming lport 29aac0a6-692c-4971-9359-052956337832 for this chassis.
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.348 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:07Z|00270|binding|INFO|29aac0a6-692c-4971-9359-052956337832: Claiming fa:16:3e:00:4d:21 10.100.0.7
Nov 29 02:05:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:07Z|00271|binding|INFO|Setting lport 29aac0a6-692c-4971-9359-052956337832 ovn-installed in OVS
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.3687] device (tap29aac0a6-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.369 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.3721] device (tap29aac0a6-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:05:07 np0005539505 systemd-machined[153285]: New machine qemu-35-instance-00000049.
Nov 29 02:05:07 np0005539505 systemd[1]: Started Virtual Machine qemu-35-instance-00000049.
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.533 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.534 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:07Z|00272|binding|INFO|Setting lport 29aac0a6-692c-4971-9359-052956337832 up in Southbound
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.537 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:4d:21 10.100.0.7'], port_security=['fa:16:3e:00:4d:21 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b7cf911a-b1c6-47f2-aed3-6384f2ef588c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=29aac0a6-692c-4971-9359-052956337832) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.539 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 29aac0a6-692c-4971-9359-052956337832 in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.542 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.556 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[72afa5aa-a652-41b7-9af5-0dcf2bd606f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.557 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.559 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.560 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d53c85c-45b0-43d7-b136-3c557cedb6d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.561 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[176d6ead-0c26-4a90-a876-5703c4474ea4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.570 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.580 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b9cc5690-580e-4378-b380-9803e927e201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.598 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fffdecd5-c747-4bc8-ab52-9a9b51ea644a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.628 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc6b5d9-47de-42a2-a70b-aecc3b8feb49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.6356] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.633 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[da08583d-8b69-4a16-9d6a-7c43e646d2ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.655 186962 DEBUG nova.compute.provider_tree [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.670 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3c7faf-c814-4910-be1b-bda5d4ea2ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.674 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d23128-2ac0-4234-bdc1-ac49b09657fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.7049] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.710 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[daa46bf0-424f-404f-a34a-2b7403fe50d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.724 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83cab563-4ff8-4ee0-bc1c-cf1790253081]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535528, 'reachable_time': 22168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225558, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.738 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[911b8d85-21b3-4e8f-847f-81bbc6dd7e37]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 535528, 'tstamp': 535528}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225559, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.752 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c1877965-cc1b-4a00-838b-5a4ca7fd3f28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535528, 'reachable_time': 22168, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225560, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.768 186962 DEBUG nova.scheduler.client.report [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.782 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e0bb8b1a-0e53-412f-8445-030fd4d578f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.805 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.808 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.833 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[58a35aec-5b49-473b-8993-1d3cad5b7a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.835 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.836 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.836 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:07 np0005539505 NetworkManager[55134]: <info>  [1764399907.8398] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Nov 29 02:05:07 np0005539505 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.873 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:07Z|00273|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.876 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.877 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21587c54-6281-4898-8952-7faf8e4dea6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.878 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:07.879 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.887 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.918 186962 DEBUG nova.compute.provider_tree [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.921 186962 INFO nova.scheduler.client.report [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Deleted allocations for instance b5fe1733-1d14-4b12-870c-69a44f532ef4#033[00m
Nov 29 02:05:07 np0005539505 nova_compute[186958]: 2025-11-29 07:05:07.939 186962 DEBUG nova.scheduler.client.report [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.002 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399908.0019689, b7cf911a-b1c6-47f2-aed3-6384f2ef588c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.002 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.249 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:08 np0005539505 podman[225599]: 2025-11-29 07:05:08.321471719 +0000 UTC m=+0.040426128 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.456 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.464 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399908.002096, b7cf911a-b1c6-47f2-aed3-6384f2ef588c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.464 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.501 186962 INFO nova.scheduler.client.report [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Deleted allocations for instance b7801223-d966-4047-b510-680042881897#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.608 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.613 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.660 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.682 186962 DEBUG oslo_concurrency.lockutils [None req-27d8362c-58b5-4548-b401-7c79334509c0 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b5fe1733-1d14-4b12-870c-69a44f532ef4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:08 np0005539505 nova_compute[186958]: 2025-11-29 07:05:08.956 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:09 np0005539505 podman[225599]: 2025-11-29 07:05:09.595484421 +0000 UTC m=+1.314438790 container create e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:05:09 np0005539505 systemd[1]: Started libpod-conmon-e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181.scope.
Nov 29 02:05:09 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:05:10 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42b971a11410596f2ef21d9f83e09e23bbb45f14c82ee25327c7311490d1096/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.161 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:10 np0005539505 podman[225599]: 2025-11-29 07:05:10.262746241 +0000 UTC m=+1.981700650 container init e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:05:10 np0005539505 podman[225599]: 2025-11-29 07:05:10.276292785 +0000 UTC m=+1.995247154 container start e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:05:10 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [NOTICE]   (225640) : New worker (225642) forked
Nov 29 02:05:10 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [NOTICE]   (225640) : Loading success.
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.548 186962 DEBUG nova.compute.manager [req-12f4dd7f-1952-4c82-8948-388a66e75c80 req-f353619e-812f-4d4e-af3d-c8ea0ad9038f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.549 186962 DEBUG oslo_concurrency.lockutils [req-12f4dd7f-1952-4c82-8948-388a66e75c80 req-f353619e-812f-4d4e-af3d-c8ea0ad9038f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.550 186962 DEBUG oslo_concurrency.lockutils [req-12f4dd7f-1952-4c82-8948-388a66e75c80 req-f353619e-812f-4d4e-af3d-c8ea0ad9038f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.550 186962 DEBUG oslo_concurrency.lockutils [req-12f4dd7f-1952-4c82-8948-388a66e75c80 req-f353619e-812f-4d4e-af3d-c8ea0ad9038f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.551 186962 DEBUG nova.compute.manager [req-12f4dd7f-1952-4c82-8948-388a66e75c80 req-f353619e-812f-4d4e-af3d-c8ea0ad9038f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Processing event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.552 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.558 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399910.557811, b7cf911a-b1c6-47f2-aed3-6384f2ef588c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.558 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] VM Resumed (Lifecycle Event)
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.561 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.565 186962 INFO nova.virt.libvirt.driver [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Instance spawned successfully.
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.565 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.570 186962 DEBUG oslo_concurrency.lockutils [None req-e0f6039d-565a-4bac-ad68-b98493937372 0c56214d54944034ac2500edac59a239 d09f64becda14f30b831bdf7371d586b - - default default] Lock "b7801223-d966-4047-b510-680042881897" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.708 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.713 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.716 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.717 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.717 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.717 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.718 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.718 186962 DEBUG nova.virt.libvirt.driver [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.787 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.854 186962 INFO nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Took 10.39 seconds to spawn the instance on the hypervisor.
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.855 186962 DEBUG nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:05:10 np0005539505 nova_compute[186958]: 2025-11-29 07:05:10.964 186962 INFO nova.compute.manager [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Took 11.01 seconds to build instance.
Nov 29 02:05:10 np0005539505 podman[225618]: 2025-11-29 07:05:10.986562685 +0000 UTC m=+1.043130809 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:05:10 np0005539505 podman[225617]: 2025-11-29 07:05:10.998292447 +0000 UTC m=+1.056149427 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 29 02:05:11 np0005539505 nova_compute[186958]: 2025-11-29 07:05:11.015 186962 DEBUG oslo_concurrency.lockutils [None req-06ba6bbc-1a83-4786-a0db-39185a3922b0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:11 np0005539505 nova_compute[186958]: 2025-11-29 07:05:11.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:13.999 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.202 186962 DEBUG nova.compute.manager [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.203 186962 DEBUG oslo_concurrency.lockutils [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.203 186962 DEBUG oslo_concurrency.lockutils [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.204 186962 DEBUG oslo_concurrency.lockutils [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.204 186962 DEBUG nova.compute.manager [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] No waiting events found dispatching network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:05:14 np0005539505 nova_compute[186958]: 2025-11-29 07:05:14.205 186962 WARNING nova.compute.manager [req-ec961e74-375e-4db4-ad3c-84a544acce5d req-2d2eea18-05e1-4b67-a9df-acf3cf7e355d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received unexpected event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 for instance with vm_state active and task_state None.
Nov 29 02:05:14 np0005539505 podman[225677]: 2025-11-29 07:05:14.726376175 +0000 UTC m=+0.056280609 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 29 02:05:15 np0005539505 nova_compute[186958]: 2025-11-29 07:05:15.165 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:19 np0005539505 nova_compute[186958]: 2025-11-29 07:05:19.001 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:20 np0005539505 nova_compute[186958]: 2025-11-29 07:05:20.169 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.445 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399906.4431493, b5fe1733-1d14-4b12-870c-69a44f532ef4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.445 186962 INFO nova.compute.manager [-] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] VM Stopped (Lifecycle Event)
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.472 186962 DEBUG nova.compute.manager [None req-049edaf1-237f-4f60-8210-39beb7850d67 - - - - - -] [instance: b5fe1733-1d14-4b12-870c-69a44f532ef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.584 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399906.5825794, b7801223-d966-4047-b510-680042881897 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.585 186962 INFO nova.compute.manager [-] [instance: b7801223-d966-4047-b510-680042881897] VM Stopped (Lifecycle Event)
Nov 29 02:05:21 np0005539505 nova_compute[186958]: 2025-11-29 07:05:21.605 186962 DEBUG nova.compute.manager [None req-a8134051-fd1b-4718-bcb0-0f06f604fb78 - - - - - -] [instance: b7801223-d966-4047-b510-680042881897] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:05:22 np0005539505 podman[225699]: 2025-11-29 07:05:22.794593472 +0000 UTC m=+0.113770930 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:05:22 np0005539505 podman[225700]: 2025-11-29 07:05:22.813422877 +0000 UTC m=+0.127238373 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:05:24 np0005539505 nova_compute[186958]: 2025-11-29 07:05:24.003 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:25 np0005539505 nova_compute[186958]: 2025-11-29 07:05:25.171 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:05:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:26.944 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:05:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:26.944 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:05:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:05:28 np0005539505 podman[225775]: 2025-11-29 07:05:28.842358162 +0000 UTC m=+0.075211856 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:05:29 np0005539505 nova_compute[186958]: 2025-11-29 07:05:29.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:30 np0005539505 nova_compute[186958]: 2025-11-29 07:05:30.174 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:30 np0005539505 podman[225796]: 2025-11-29 07:05:30.728407676 +0000 UTC m=+0.061605550 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 02:05:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:31Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:4d:21 10.100.0.7
Nov 29 02:05:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:31Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:4d:21 10.100.0.7
Nov 29 02:05:34 np0005539505 nova_compute[186958]: 2025-11-29 07:05:34.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:35 np0005539505 nova_compute[186958]: 2025-11-29 07:05:35.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:36.369 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:36.371 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:05:36 np0005539505 nova_compute[186958]: 2025-11-29 07:05:36.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.279 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.280 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.280 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.280 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.281 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:38 np0005539505 nova_compute[186958]: 2025-11-29 07:05:38.295 186962 INFO nova.compute.manager [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Terminating instance#033[00m
Nov 29 02:05:39 np0005539505 nova_compute[186958]: 2025-11-29 07:05:39.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:39.374 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:40 np0005539505 nova_compute[186958]: 2025-11-29 07:05:40.179 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:40 np0005539505 nova_compute[186958]: 2025-11-29 07:05:40.824 186962 DEBUG nova.compute.manager [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:05:40 np0005539505 kernel: tap29aac0a6-69 (unregistering): left promiscuous mode
Nov 29 02:05:40 np0005539505 NetworkManager[55134]: <info>  [1764399940.8510] device (tap29aac0a6-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:05:40 np0005539505 nova_compute[186958]: 2025-11-29 07:05:40.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:40Z|00274|binding|INFO|Releasing lport 29aac0a6-692c-4971-9359-052956337832 from this chassis (sb_readonly=0)
Nov 29 02:05:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:40Z|00275|binding|INFO|Setting lport 29aac0a6-692c-4971-9359-052956337832 down in Southbound
Nov 29 02:05:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:05:40Z|00276|binding|INFO|Removing iface tap29aac0a6-69 ovn-installed in OVS
Nov 29 02:05:40 np0005539505 nova_compute[186958]: 2025-11-29 07:05:40.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:40.877 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:4d:21 10.100.0.7'], port_security=['fa:16:3e:00:4d:21 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b7cf911a-b1c6-47f2-aed3-6384f2ef588c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=29aac0a6-692c-4971-9359-052956337832) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:05:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:40.880 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 29aac0a6-692c-4971-9359-052956337832 in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis#033[00m
Nov 29 02:05:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:40.882 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:05:40 np0005539505 nova_compute[186958]: 2025-11-29 07:05:40.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:40.885 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5cc22ba4-2345-45b2-ba24-f20d7a598310]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:40.888 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore#033[00m
Nov 29 02:05:40 np0005539505 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000049.scope: Deactivated successfully.
Nov 29 02:05:40 np0005539505 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000049.scope: Consumed 13.932s CPU time.
Nov 29 02:05:40 np0005539505 systemd-machined[153285]: Machine qemu-35-instance-00000049 terminated.
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.099 186962 INFO nova.virt.libvirt.driver [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Instance destroyed successfully.#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.100 186962 DEBUG nova.objects.instance [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid b7cf911a-b1c6-47f2-aed3-6384f2ef588c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.169 186962 DEBUG nova.virt.libvirt.vif [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1328477640',display_name='tempest-ServersTestJSON-server-1328477640',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-1328477640',id=73,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:05:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-lb0i5ir4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:05:10Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=b7cf911a-b1c6-47f2-aed3-6384f2ef588c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.169 186962 DEBUG nova.network.os_vif_util [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "29aac0a6-692c-4971-9359-052956337832", "address": "fa:16:3e:00:4d:21", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29aac0a6-69", "ovs_interfaceid": "29aac0a6-692c-4971-9359-052956337832", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.170 186962 DEBUG nova.network.os_vif_util [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.170 186962 DEBUG os_vif [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.173 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.173 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29aac0a6-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.180 186962 INFO os_vif [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:4d:21,bridge_name='br-int',has_traffic_filtering=True,id=29aac0a6-692c-4971-9359-052956337832,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29aac0a6-69')#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.180 186962 INFO nova.virt.libvirt.driver [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Deleting instance files /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c_del#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.181 186962 INFO nova.virt.libvirt.driver [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Deletion of /var/lib/nova/instances/b7cf911a-b1c6-47f2-aed3-6384f2ef588c_del complete#033[00m
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [NOTICE]   (225640) : haproxy version is 2.8.14-c23fe91
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [NOTICE]   (225640) : path to executable is /usr/sbin/haproxy
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [WARNING]  (225640) : Exiting Master process...
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [WARNING]  (225640) : Exiting Master process...
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [ALERT]    (225640) : Current worker (225642) exited with code 143 (Terminated)
Nov 29 02:05:41 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[225615]: [WARNING]  (225640) : All workers exited. Exiting... (0)
Nov 29 02:05:41 np0005539505 systemd[1]: libpod-e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181.scope: Deactivated successfully.
Nov 29 02:05:41 np0005539505 podman[225840]: 2025-11-29 07:05:41.206422812 +0000 UTC m=+0.217526605 container died e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:05:41 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181-userdata-shm.mount: Deactivated successfully.
Nov 29 02:05:41 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c42b971a11410596f2ef21d9f83e09e23bbb45f14c82ee25327c7311490d1096-merged.mount: Deactivated successfully.
Nov 29 02:05:41 np0005539505 podman[225870]: 2025-11-29 07:05:41.472897606 +0000 UTC m=+0.347102014 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Nov 29 02:05:41 np0005539505 podman[225871]: 2025-11-29 07:05:41.473541954 +0000 UTC m=+0.344107278 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:05:41 np0005539505 podman[225840]: 2025-11-29 07:05:41.510554265 +0000 UTC m=+0.521658058 container cleanup e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:05:41 np0005539505 systemd[1]: libpod-conmon-e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181.scope: Deactivated successfully.
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.627 186962 INFO nova.compute.manager [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.628 186962 DEBUG oslo.service.loopingcall [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.628 186962 DEBUG nova.compute.manager [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.629 186962 DEBUG nova.network.neutron [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:05:41 np0005539505 podman[225932]: 2025-11-29 07:05:41.653814372 +0000 UTC m=+0.121800019 container remove e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.660 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0d46af14-03a1-444e-a49e-3e5207aaa26d]: (4, ('Sat Nov 29 07:05:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181)\ne99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181\nSat Nov 29 07:05:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (e99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181)\ne99b47b51a6606172c776dd6fa96149538250a9604aba389f525971fafa96181\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.662 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4550d152-0a3a-46f8-8da6-0920e43e0fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.664 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:05:41 np0005539505 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.665 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.679 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.681 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9926308f-a227-4a3c-89f2-60cd99309cea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.709 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5f40a946-7e9e-40f3-a408-928d8d0cbd2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.711 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a7611dbb-cd60-4802-9169-2735f3d07ae7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.729 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e31fb7e7-86be-4eb3-89a4-4aad928f4daf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 535520, 'reachable_time': 32096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225946, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.732 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:05:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:05:41.733 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9662f8-8267-4ba0-958e-7b6af188a15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:05:41 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.793 186962 DEBUG nova.compute.manager [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-unplugged-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.794 186962 DEBUG oslo_concurrency.lockutils [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.794 186962 DEBUG oslo_concurrency.lockutils [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.795 186962 DEBUG oslo_concurrency.lockutils [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.795 186962 DEBUG nova.compute.manager [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] No waiting events found dispatching network-vif-unplugged-29aac0a6-692c-4971-9359-052956337832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:41 np0005539505 nova_compute[186958]: 2025-11-29 07:05:41.795 186962 DEBUG nova.compute.manager [req-825dff70-c2e3-46ae-a92a-83f52c1955ba req-c6e1ccb9-7019-4b1f-b248-ae1b856aef6e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-unplugged-29aac0a6-692c-4971-9359-052956337832 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.907 186962 DEBUG nova.compute.manager [req-db8e140b-aabe-4d81-8f68-f052d5746ef8 req-68ddc7ef-0405-465f-822b-e98e872cf922 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-deleted-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.907 186962 INFO nova.compute.manager [req-db8e140b-aabe-4d81-8f68-f052d5746ef8 req-68ddc7ef-0405-465f-822b-e98e872cf922 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Neutron deleted interface 29aac0a6-692c-4971-9359-052956337832; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.907 186962 DEBUG nova.network.neutron [req-db8e140b-aabe-4d81-8f68-f052d5746ef8 req-68ddc7ef-0405-465f-822b-e98e872cf922 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.910 186962 DEBUG nova.compute.manager [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.910 186962 DEBUG oslo_concurrency.lockutils [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.911 186962 DEBUG oslo_concurrency.lockutils [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.911 186962 DEBUG oslo_concurrency.lockutils [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.911 186962 DEBUG nova.compute.manager [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] No waiting events found dispatching network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.911 186962 WARNING nova.compute.manager [req-4838cb1d-cd13-43d4-84da-2952ea2b4193 req-12fcce2e-8c7a-4a6a-a361-352a38eef730 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Received unexpected event network-vif-plugged-29aac0a6-692c-4971-9359-052956337832 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:05:43 np0005539505 nova_compute[186958]: 2025-11-29 07:05:43.914 186962 DEBUG nova.network.neutron [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:05:44 np0005539505 nova_compute[186958]: 2025-11-29 07:05:44.012 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:44 np0005539505 nova_compute[186958]: 2025-11-29 07:05:44.613 186962 INFO nova.compute.manager [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Took 2.98 seconds to deallocate network for instance.#033[00m
Nov 29 02:05:44 np0005539505 nova_compute[186958]: 2025-11-29 07:05:44.617 186962 DEBUG nova.compute.manager [req-db8e140b-aabe-4d81-8f68-f052d5746ef8 req-68ddc7ef-0405-465f-822b-e98e872cf922 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Detach interface failed, port_id=29aac0a6-692c-4971-9359-052956337832, reason: Instance b7cf911a-b1c6-47f2-aed3-6384f2ef588c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:05:45 np0005539505 podman[225947]: 2025-11-29 07:05:45.7454222 +0000 UTC m=+0.077522020 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:05:45 np0005539505 nova_compute[186958]: 2025-11-29 07:05:45.863 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:45 np0005539505 nova_compute[186958]: 2025-11-29 07:05:45.864 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:45 np0005539505 nova_compute[186958]: 2025-11-29 07:05:45.934 186962 DEBUG nova.compute.provider_tree [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:46 np0005539505 nova_compute[186958]: 2025-11-29 07:05:46.039 186962 DEBUG nova.scheduler.client.report [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:46 np0005539505 nova_compute[186958]: 2025-11-29 07:05:46.206 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:46 np0005539505 nova_compute[186958]: 2025-11-29 07:05:46.228 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:47 np0005539505 nova_compute[186958]: 2025-11-29 07:05:47.046 186962 INFO nova.scheduler.client.report [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance b7cf911a-b1c6-47f2-aed3-6384f2ef588c#033[00m
Nov 29 02:05:47 np0005539505 nova_compute[186958]: 2025-11-29 07:05:47.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:48 np0005539505 nova_compute[186958]: 2025-11-29 07:05:48.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:48 np0005539505 nova_compute[186958]: 2025-11-29 07:05:48.659 186962 DEBUG oslo_concurrency.lockutils [None req-3c2d66cc-5e56-4af9-b531-756f8a283046 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "b7cf911a-b1c6-47f2-aed3-6384f2ef588c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:49 np0005539505 nova_compute[186958]: 2025-11-29 07:05:49.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:51 np0005539505 nova_compute[186958]: 2025-11-29 07:05:51.230 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:52 np0005539505 nova_compute[186958]: 2025-11-29 07:05:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.209 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.210 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.243 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.668 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.668 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.675 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:05:53 np0005539505 nova_compute[186958]: 2025-11-29 07:05:53.675 186962 INFO nova.compute.claims [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:05:53 np0005539505 podman[225967]: 2025-11-29 07:05:53.727073939 +0000 UTC m=+0.061001443 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:05:53 np0005539505 podman[225968]: 2025-11-29 07:05:53.788211734 +0000 UTC m=+0.119556825 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.015 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.482 186962 DEBUG nova.compute.provider_tree [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.704 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.705 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.723 186962 DEBUG nova.scheduler.client.report [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.726 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.957 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.289s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:54 np0005539505 nova_compute[186958]: 2025-11-29 07:05:54.958 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.165 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.166 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.186 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.186 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.194 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.194 186962 INFO nova.compute.claims [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.670 186962 INFO nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:05:55 np0005539505 nova_compute[186958]: 2025-11-29 07:05:55.828 186962 DEBUG nova.policy [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:05:56 np0005539505 nova_compute[186958]: 2025-11-29 07:05:56.097 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399941.0960712, b7cf911a-b1c6-47f2-aed3-6384f2ef588c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:05:56 np0005539505 nova_compute[186958]: 2025-11-29 07:05:56.097 186962 INFO nova.compute.manager [-] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:05:56 np0005539505 nova_compute[186958]: 2025-11-29 07:05:56.231 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:57 np0005539505 nova_compute[186958]: 2025-11-29 07:05:57.003 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:57 np0005539505 nova_compute[186958]: 2025-11-29 07:05:57.290 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:05:57 np0005539505 nova_compute[186958]: 2025-11-29 07:05:57.315 186962 DEBUG nova.compute.manager [None req-0c28dd51-158a-47dc-b473-694803222db1 - - - - - -] [instance: b7cf911a-b1c6-47f2-aed3-6384f2ef588c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:05:57 np0005539505 nova_compute[186958]: 2025-11-29 07:05:57.766 186962 DEBUG nova.compute.provider_tree [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:57 np0005539505 nova_compute[186958]: 2025-11-29 07:05:57.953 186962 DEBUG nova.scheduler.client.report [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.011 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.012 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.015 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.011s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.015 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.015 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.186 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.188 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.189 186962 INFO nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Creating image(s)#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.189 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.189 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.190 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.205 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.271 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.272 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.273 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.287 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.330 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.331 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5736MB free_disk=73.22621154785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.332 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.332 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.353 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.354 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.501 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.501 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.620 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.621 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 713a4384-a974-4edd-9e95-ef29b5169889 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.621 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.621 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.673 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk 1073741824" returned: 0 in 0.319s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.674 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.675 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.703 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.713 186962 INFO nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.745 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.746 186962 DEBUG nova.virt.disk.api [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.746 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.768 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.806 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.807 186962 DEBUG nova.virt.disk.api [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.807 186962 DEBUG nova.objects.instance [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.828 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.866 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.866 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.867 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.867 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Ensure instance console log exists: /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.868 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.868 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.868 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:58 np0005539505 nova_compute[186958]: 2025-11-29 07:05:58.974 186962 DEBUG nova.policy [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '61b629c14bdd42a4ab950e1a86b22ac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38296bde2b4d4a719f568c7ed5c6c0bf', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.017 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.230 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Successfully created port: 27b3dbda-6b3a-4b71-8698-f91704c0d5fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.412 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.413 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.414 186962 INFO nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Creating image(s)#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.414 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.415 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.415 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.430 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.490 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.491 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.491 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.504 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.563 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.564 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:59 np0005539505 podman[226041]: 2025-11-29 07:05:59.73651256 +0000 UTC m=+0.065517353 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.745 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk 1073741824" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.747 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.748 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.825 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.827 186962 DEBUG nova.virt.disk.api [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Checking if we can resize image /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.827 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.887 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.888 186962 DEBUG nova.virt.disk.api [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Cannot resize image /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:05:59 np0005539505 nova_compute[186958]: 2025-11-29 07:05:59.889 186962 DEBUG nova.objects.instance [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lazy-loading 'migration_context' on Instance uuid 713a4384-a974-4edd-9e95-ef29b5169889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:00 np0005539505 nova_compute[186958]: 2025-11-29 07:06:00.084 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:06:00 np0005539505 nova_compute[186958]: 2025-11-29 07:06:00.084 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Ensure instance console log exists: /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:06:00 np0005539505 nova_compute[186958]: 2025-11-29 07:06:00.085 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:00 np0005539505 nova_compute[186958]: 2025-11-29 07:06:00.086 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:00 np0005539505 nova_compute[186958]: 2025-11-29 07:06:00.086 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:01 np0005539505 nova_compute[186958]: 2025-11-29 07:06:01.233 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:01 np0005539505 podman[226066]: 2025-11-29 07:06:01.726863062 +0000 UTC m=+0.058657630 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:06:02 np0005539505 nova_compute[186958]: 2025-11-29 07:06:02.865 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:03 np0005539505 nova_compute[186958]: 2025-11-29 07:06:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:03 np0005539505 nova_compute[186958]: 2025-11-29 07:06:03.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:06:03 np0005539505 nova_compute[186958]: 2025-11-29 07:06:03.726 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:06:04 np0005539505 nova_compute[186958]: 2025-11-29 07:06:04.020 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:04 np0005539505 nova_compute[186958]: 2025-11-29 07:06:04.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:04 np0005539505 nova_compute[186958]: 2025-11-29 07:06:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:04 np0005539505 nova_compute[186958]: 2025-11-29 07:06:04.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:06:05 np0005539505 nova_compute[186958]: 2025-11-29 07:06:05.382 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Successfully created port: 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:06:06 np0005539505 nova_compute[186958]: 2025-11-29 07:06:06.235 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:06 np0005539505 nova_compute[186958]: 2025-11-29 07:06:06.780 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Successfully updated port: 27b3dbda-6b3a-4b71-8698-f91704c0d5fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:06:06 np0005539505 nova_compute[186958]: 2025-11-29 07:06:06.794 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:06 np0005539505 nova_compute[186958]: 2025-11-29 07:06:06.795 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:06 np0005539505 nova_compute[186958]: 2025-11-29 07:06:06.795 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:06:07 np0005539505 nova_compute[186958]: 2025-11-29 07:06:07.128 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:06:07 np0005539505 nova_compute[186958]: 2025-11-29 07:06:07.325 186962 DEBUG nova.compute.manager [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-changed-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:07 np0005539505 nova_compute[186958]: 2025-11-29 07:06:07.325 186962 DEBUG nova.compute.manager [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Refreshing instance network info cache due to event network-changed-27b3dbda-6b3a-4b71-8698-f91704c0d5fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:06:07 np0005539505 nova_compute[186958]: 2025-11-29 07:06:07.325 186962 DEBUG oslo_concurrency.lockutils [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.541 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Successfully updated port: 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.584 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.585 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquired lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.585 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.742 186962 DEBUG nova.network.neutron [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Updating instance_info_cache with network_info: [{"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.984 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.985 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Instance network_info: |[{"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.985 186962 DEBUG oslo_concurrency.lockutils [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.986 186962 DEBUG nova.network.neutron [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Refreshing network info cache for port 27b3dbda-6b3a-4b71-8698-f91704c0d5fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.989 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Start _get_guest_xml network_info=[{"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:06:08 np0005539505 nova_compute[186958]: 2025-11-29 07:06:08.993 186962 WARNING nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.013 186962 DEBUG nova.virt.libvirt.host [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.013 186962 DEBUG nova.virt.libvirt.host [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.016 186962 DEBUG nova.virt.libvirt.host [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.017 186962 DEBUG nova.virt.libvirt.host [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.018 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.018 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.018 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.018 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.019 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.019 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.019 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.021 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.021 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.021 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.022 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.022 186962 DEBUG nova.virt.hardware [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.025 186962 DEBUG nova.virt.libvirt.vif [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2061473623',display_name='tempest-ServersTestJSON-server-2061473623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-2061473623',id=75,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-f75ra1ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:57Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=82b53ac1-6801-4ad8-b4d1-34e59c5d20d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.026 186962 DEBUG nova.network.os_vif_util [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.026 186962 DEBUG nova.network.os_vif_util [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.027 186962 DEBUG nova.objects.instance [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.028 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.051 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <uuid>82b53ac1-6801-4ad8-b4d1-34e59c5d20d6</uuid>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <name>instance-0000004b</name>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-2061473623</nova:name>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:06:08</nova:creationTime>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        <nova:port uuid="27b3dbda-6b3a-4b71-8698-f91704c0d5fa">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="serial">82b53ac1-6801-4ad8-b4d1-34e59c5d20d6</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="uuid">82b53ac1-6801-4ad8-b4d1-34e59c5d20d6</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.config"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:86:ad:71"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <target dev="tap27b3dbda-6b"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/console.log" append="off"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:06:09 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:06:09 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:06:09 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:06:09 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.052 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Preparing to wait for external event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.053 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.053 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.053 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.054 186962 DEBUG nova.virt.libvirt.vif [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2061473623',display_name='tempest-ServersTestJSON-server-2061473623',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-2061473623',id=75,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-f75ra1ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:57Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=82b53ac1-6801-4ad8-b4d1-34e59c5d20d6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.054 186962 DEBUG nova.network.os_vif_util [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.055 186962 DEBUG nova.network.os_vif_util [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.055 186962 DEBUG os_vif [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.056 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.056 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.060 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.060 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27b3dbda-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.060 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27b3dbda-6b, col_values=(('external_ids', {'iface-id': '27b3dbda-6b3a-4b71-8698-f91704c0d5fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:ad:71', 'vm-uuid': '82b53ac1-6801-4ad8-b4d1-34e59c5d20d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 NetworkManager[55134]: <info>  [1764399969.0637] manager: (tap27b3dbda-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.064 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.068 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.069 186962 INFO os_vif [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b')
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.086 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.250 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.251 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.260 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:86:ad:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.261 186962 INFO nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Using config drive
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.431 186962 DEBUG nova.compute.manager [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-changed-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.432 186962 DEBUG nova.compute.manager [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Refreshing instance network info cache due to event network-changed-2fd6d05d-ccf2-45e4-8e10-8d64843e6150. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.432 186962 DEBUG oslo_concurrency.lockutils [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.654 186962 INFO nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Creating config drive at /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.config
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.662 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvb7gdlv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.798 186962 DEBUG oslo_concurrency.processutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvb7gdlv" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:06:09 np0005539505 kernel: tap27b3dbda-6b: entered promiscuous mode
Nov 29 02:06:09 np0005539505 NetworkManager[55134]: <info>  [1764399969.8630] manager: (tap27b3dbda-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Nov 29 02:06:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:09Z|00277|binding|INFO|Claiming lport 27b3dbda-6b3a-4b71-8698-f91704c0d5fa for this chassis.
Nov 29 02:06:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:09Z|00278|binding|INFO|27b3dbda-6b3a-4b71-8698-f91704c0d5fa: Claiming fa:16:3e:86:ad:71 10.100.0.9
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.863 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.871 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ad:71 10.100.0.9'], port_security=['fa:16:3e:86:ad:71 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '82b53ac1-6801-4ad8-b4d1-34e59c5d20d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=27b3dbda-6b3a-4b71-8698-f91704c0d5fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.872 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 27b3dbda-6b3a-4b71-8698-f91704c0d5fa in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.874 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:06:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:09Z|00279|binding|INFO|Setting lport 27b3dbda-6b3a-4b71-8698-f91704c0d5fa ovn-installed in OVS
Nov 29 02:06:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:09Z|00280|binding|INFO|Setting lport 27b3dbda-6b3a-4b71-8698-f91704c0d5fa up in Southbound
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.889 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2124e2-04fa-4fd4-b5d9-ae5298d60894]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.891 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.892 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.893 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e410230d-1f00-400d-945f-f0d192f3aac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.893 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[36fb36ea-f837-4e04-b96e-26d2b0054cc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 systemd-udevd[226106]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:09 np0005539505 systemd-machined[153285]: New machine qemu-36-instance-0000004b.
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.906 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[2c108a2c-72ac-4728-abc4-c4272e366b23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 NetworkManager[55134]: <info>  [1764399969.9120] device (tap27b3dbda-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:06:09 np0005539505 NetworkManager[55134]: <info>  [1764399969.9135] device (tap27b3dbda-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.920 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c3fb77e0-c1d0-418b-80a7-11e13834ca33]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 systemd[1]: Started Virtual Machine qemu-36-instance-0000004b.
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.940 186962 DEBUG nova.network.neutron [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Updating instance_info_cache with network_info: [{"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.946 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d95ed5ed-35d2-493f-b9fc-3433f0ff8867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[313a75bc-dc84-4c44-b628-f08724801b7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 systemd-udevd[226110]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:09 np0005539505 NetworkManager[55134]: <info>  [1764399969.9541] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.960 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Releasing lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.961 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Instance network_info: |[{"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.962 186962 DEBUG oslo_concurrency.lockutils [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.962 186962 DEBUG nova.network.neutron [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Refreshing network info cache for port 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.965 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Start _get_guest_xml network_info=[{"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.975 186962 WARNING nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.985 186962 DEBUG nova.virt.libvirt.host [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.986 186962 DEBUG nova.virt.libvirt.host [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.991 186962 DEBUG nova.virt.libvirt.host [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.992 186962 DEBUG nova.virt.libvirt.host [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.994 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.994 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.995 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.995 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.995 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.995 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.996 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.995 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd0644e-d526-4631-9bc6-d7d086256be5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.996 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.997 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.997 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.997 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:06:09 np0005539505 nova_compute[186958]: 2025-11-29 07:06:09.997 186962 DEBUG nova.virt.hardware [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:06:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:09.998 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3f98f9b5-a583-4dba-83ff-ea87167ae13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.001 186962 DEBUG nova.virt.libvirt.vif [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1795175015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1795175015',id=76,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38296bde2b4d4a719f568c7ed5c6c0bf',ramdisk_id='',reservation_id='r-cso933n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1014245885',owner_user_name='tempest-InstanceActionsV221TestJSON-1014245885-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:58Z,user_data=None,user_id='61b629c14bdd42a4ab950e1a86b22ac4',uuid=713a4384-a974-4edd-9e95-ef29b5169889,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.001 186962 DEBUG nova.network.os_vif_util [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converting VIF {"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.002 186962 DEBUG nova.network.os_vif_util [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.003 186962 DEBUG nova.objects.instance [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 713a4384-a974-4edd-9e95-ef29b5169889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.018 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <uuid>713a4384-a974-4edd-9e95-ef29b5169889</uuid>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <name>instance-0000004c</name>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:name>tempest-InstanceActionsV221TestJSON-server-1795175015</nova:name>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:06:09</nova:creationTime>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:user uuid="61b629c14bdd42a4ab950e1a86b22ac4">tempest-InstanceActionsV221TestJSON-1014245885-project-member</nova:user>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:project uuid="38296bde2b4d4a719f568c7ed5c6c0bf">tempest-InstanceActionsV221TestJSON-1014245885</nova:project>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        <nova:port uuid="2fd6d05d-ccf2-45e4-8e10-8d64843e6150">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="serial">713a4384-a974-4edd-9e95-ef29b5169889</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="uuid">713a4384-a974-4edd-9e95-ef29b5169889</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.config"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:4f:c0:4a"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <target dev="tap2fd6d05d-cc"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/console.log" append="off"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:06:10 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:06:10 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:06:10 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:06:10 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.023 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Preparing to wait for external event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.0247] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.024 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.025 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.026 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.026 186962 DEBUG nova.virt.libvirt.vif [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:05:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1795175015',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1795175015',id=76,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='38296bde2b4d4a719f568c7ed5c6c0bf',ramdisk_id='',reservation_id='r-cso933n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-1014245885',owner_user_name='tempest-InstanceActionsV221TestJSON-10142458
85-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:05:58Z,user_data=None,user_id='61b629c14bdd42a4ab950e1a86b22ac4',uuid=713a4384-a974-4edd-9e95-ef29b5169889,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.027 186962 DEBUG nova.network.os_vif_util [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converting VIF {"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.027 186962 DEBUG nova.network.os_vif_util [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.028 186962 DEBUG os_vif [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.028 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.029 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.029 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.032 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fd6d05d-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.032 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2fd6d05d-cc, col_values=(('external_ids', {'iface-id': '2fd6d05d-ccf2-45e4-8e10-8d64843e6150', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:c0:4a', 'vm-uuid': '713a4384-a974-4edd-9e95-ef29b5169889'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.031 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[986cb6e0-8265-4d7f-8cf1-099c4d6e5e64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.0351] manager: (tap2fd6d05d-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.037 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.041 186962 INFO os_vif [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc')#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.051 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48db6bb5-8d08-44b3-bd06-86988db69e83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541760, 'reachable_time': 42370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226140, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.068 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[09e25360-bee5-4b6d-bc4b-1f86ea722964]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541760, 'tstamp': 541760}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226141, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.089 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f6de47-c021-47df-8397-793be065c7e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541760, 'reachable_time': 42370, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226142, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.118 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4e8c04-fecb-442c-8487-ed6cd4af294f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.173 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff521ad-246a-41d1-818f-38f5af44f4ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.174 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.174 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.175 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.1776] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Nov 29 02:06:10 np0005539505 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.182 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.183 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.184 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:10Z|00281|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.200 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.203 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.204 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.205 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[90730acb-9445-4dc1-ada0-68c6032cac82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.206 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.207 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.221 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.223 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.223 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] No VIF found with MAC fa:16:3e:4f:c0:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.223 186962 INFO nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Using config drive#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.407 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399970.4076583, 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.409 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] VM Started (Lifecycle Event)#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.434 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.438 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399970.4078028, 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.438 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.464 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.467 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.498 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.528 186962 DEBUG nova.compute.manager [req-bcb960c8-a43d-4e51-9a25-1b4b2eeee519 req-a02e1551-c41b-433c-a2b7-ef5e46b4ce3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.530 186962 DEBUG oslo_concurrency.lockutils [req-bcb960c8-a43d-4e51-9a25-1b4b2eeee519 req-a02e1551-c41b-433c-a2b7-ef5e46b4ce3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.530 186962 DEBUG oslo_concurrency.lockutils [req-bcb960c8-a43d-4e51-9a25-1b4b2eeee519 req-a02e1551-c41b-433c-a2b7-ef5e46b4ce3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.530 186962 DEBUG oslo_concurrency.lockutils [req-bcb960c8-a43d-4e51-9a25-1b4b2eeee519 req-a02e1551-c41b-433c-a2b7-ef5e46b4ce3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.531 186962 DEBUG nova.compute.manager [req-bcb960c8-a43d-4e51-9a25-1b4b2eeee519 req-a02e1551-c41b-433c-a2b7-ef5e46b4ce3e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Processing event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.532 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.535 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399970.535741, 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.536 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.538 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.541 186962 INFO nova.virt.libvirt.driver [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Instance spawned successfully.#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.542 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.560 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.566 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.567 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.567 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.567 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.568 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.568 186962 DEBUG nova.virt.libvirt.driver [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.570 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.604 186962 DEBUG nova.network.neutron [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Updated VIF entry in instance network info cache for port 27b3dbda-6b3a-4b71-8698-f91704c0d5fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.605 186962 DEBUG nova.network.neutron [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Updating instance_info_cache with network_info: [{"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.606 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.609 186962 INFO nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Creating config drive at /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.config#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.615 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq5_v3km1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:10 np0005539505 podman[226186]: 2025-11-29 07:06:10.547609583 +0000 UTC m=+0.026169360 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.644 186962 DEBUG oslo_concurrency.lockutils [req-bebf5820-ed7f-4c61-b030-b54d1b1f5259 req-b6afad57-1987-4e4e-a521-60dbe1aa949e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.679 186962 INFO nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Took 12.49 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.680 186962 DEBUG nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.743 186962 DEBUG oslo_concurrency.processutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq5_v3km1" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.774 186962 INFO nova.compute.manager [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Took 17.25 seconds to build instance.#033[00m
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.797 186962 DEBUG oslo_concurrency.lockutils [None req-596f93ca-0986-4528-b7a2-d73279ac4efb f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:10 np0005539505 kernel: tap2fd6d05d-cc: entered promiscuous mode
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.8116] manager: (tap2fd6d05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.812 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 systemd-udevd[226134]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:10Z|00282|binding|INFO|Claiming lport 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 for this chassis.
Nov 29 02:06:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:10Z|00283|binding|INFO|2fd6d05d-ccf2-45e4-8e10-8d64843e6150: Claiming fa:16:3e:4f:c0:4a 10.100.0.8
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.819 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:10.831 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:c0:4a 10.100.0.8'], port_security=['fa:16:3e:4f:c0:4a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '713a4384-a974-4edd-9e95-ef29b5169889', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38296bde2b4d4a719f568c7ed5c6c0bf', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35aeb8e2-829e-4031-8525-cc861ebfb4bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8248c843-6b85-4e58-8f04-912019203745, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2fd6d05d-ccf2-45e4-8e10-8d64843e6150) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.8368] device (tap2fd6d05d-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:06:10 np0005539505 NetworkManager[55134]: <info>  [1764399970.8375] device (tap2fd6d05d-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:06:10 np0005539505 systemd-machined[153285]: New machine qemu-37-instance-0000004c.
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:10 np0005539505 systemd[1]: Started Virtual Machine qemu-37-instance-0000004c.
Nov 29 02:06:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:10Z|00284|binding|INFO|Setting lport 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 ovn-installed in OVS
Nov 29 02:06:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:10Z|00285|binding|INFO|Setting lport 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 up in Southbound
Nov 29 02:06:10 np0005539505 nova_compute[186958]: 2025-11-29 07:06:10.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:11 np0005539505 podman[226186]: 2025-11-29 07:06:11.180768194 +0000 UTC m=+0.659327941 container create 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.184 186962 DEBUG nova.network.neutron [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Updated VIF entry in instance network info cache for port 2fd6d05d-ccf2-45e4-8e10-8d64843e6150. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.185 186962 DEBUG nova.network.neutron [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Updating instance_info_cache with network_info: [{"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.206 186962 DEBUG oslo_concurrency.lockutils [req-174d15d8-aa30-40d6-a459-816b2d4bdacf req-33feb37d-4427-4bd8-a0e3-95b170e601f8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-713a4384-a974-4edd-9e95-ef29b5169889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:11 np0005539505 systemd[1]: Started libpod-conmon-74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d.scope.
Nov 29 02:06:11 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:06:11 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a64238c8e03e1a897871dec4f921913d6b762d50e7636c040dbc6cf2f821ef8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.420 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399971.4198105, 713a4384-a974-4edd-9e95-ef29b5169889 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.420 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] VM Started (Lifecycle Event)#033[00m
Nov 29 02:06:11 np0005539505 podman[226186]: 2025-11-29 07:06:11.424755141 +0000 UTC m=+0.903314918 container init 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:06:11 np0005539505 podman[226186]: 2025-11-29 07:06:11.431020659 +0000 UTC m=+0.909580406 container start 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.441 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.446 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399971.4200912, 713a4384-a974-4edd-9e95-ef29b5169889 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.446 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:06:11 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [NOTICE]   (226237) : New worker (226239) forked
Nov 29 02:06:11 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [NOTICE]   (226237) : Loading success.
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.473 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.478 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.506 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.542 186962 DEBUG nova.compute.manager [req-1b29f1b8-fc52-45d3-9d51-5a4528bed6f3 req-cb733149-5ddc-4c37-84d4-2adbec0c2587 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.543 186962 DEBUG oslo_concurrency.lockutils [req-1b29f1b8-fc52-45d3-9d51-5a4528bed6f3 req-cb733149-5ddc-4c37-84d4-2adbec0c2587 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.543 186962 DEBUG oslo_concurrency.lockutils [req-1b29f1b8-fc52-45d3-9d51-5a4528bed6f3 req-cb733149-5ddc-4c37-84d4-2adbec0c2587 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.544 186962 DEBUG oslo_concurrency.lockutils [req-1b29f1b8-fc52-45d3-9d51-5a4528bed6f3 req-cb733149-5ddc-4c37-84d4-2adbec0c2587 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.544 186962 DEBUG nova.compute.manager [req-1b29f1b8-fc52-45d3-9d51-5a4528bed6f3 req-cb733149-5ddc-4c37-84d4-2adbec0c2587 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Processing event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.545 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.549 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399971.548742, 713a4384-a974-4edd-9e95-ef29b5169889 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.549 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.551 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.554 186962 INFO nova.virt.libvirt.driver [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Instance spawned successfully.#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.555 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.577 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.582 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.596 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.597 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.597 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.598 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.598 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.598 186962 DEBUG nova.virt.libvirt.driver [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.622 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.625 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 in datapath 4b7db8a7-cccc-445d-b22b-94b920fc0457 unbound from our chassis#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.626 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b7db8a7-cccc-445d-b22b-94b920fc0457#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.639 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0811baa9-e597-4292-a5ed-242998cc4dff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.640 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b7db8a7-c1 in ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.646 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b7db8a7-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa61f50-ebb8-4fe3-aa68-6a5b50a25988]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.648 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd24fc0-5bd9-44bf-a86e-21978f2aab7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.660 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[011fa14f-178e-4cc8-8d87-42150d323260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.672 186962 INFO nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Took 12.26 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.672 186962 DEBUG nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.702 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5d785b87-e03b-4bc3-8302-1ad0efd4cb53]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.731 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d85d707b-968f-4834-8829-d851d1886666]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 NetworkManager[55134]: <info>  [1764399971.7370] manager: (tap4b7db8a7-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/144)
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d3af7848-93cc-423d-82cb-e01740dceace]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 podman[226251]: 2025-11-29 07:06:11.74266885 +0000 UTC m=+0.055035067 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.762 186962 INFO nova.compute.manager [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Took 16.62 seconds to build instance.#033[00m
Nov 29 02:06:11 np0005539505 podman[226249]: 2025-11-29 07:06:11.768727526 +0000 UTC m=+0.096446788 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.780 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[81e130c8-8d44-477d-89c2-e7cc0124a57a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.783 186962 DEBUG oslo_concurrency.lockutils [None req-16e4fa2c-a979-43d7-ac97-cced72af58a3 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.785 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[56b7b26f-61df-4ede-ae26-3d9c9da36d13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 NetworkManager[55134]: <info>  [1764399971.8156] device (tap4b7db8a7-c0): carrier: link connected
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.821 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d85ec4-2e8f-4a85-a5d0-eb353d7bc1b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.836 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a711119f-0101-48cf-9cf8-eeee5cfa05ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b7db8a7-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:8f:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541939, 'reachable_time': 35837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226301, 'error': None, 'target': 'ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.857 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e146ed0e-3034-4205-b02f-dbe65fd7b855]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:8fd7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541939, 'tstamp': 541939}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226302, 'error': None, 'target': 'ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.873 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ff06e5-7ddc-4239-b560-7c5150d7d606]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b7db8a7-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:8f:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541939, 'reachable_time': 35837, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226303, 'error': None, 'target': 'ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.911 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6492777-b7cf-457e-8d8a-fa7281e444d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.980 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc76327-0609-4e99-bc10-9eda6ef75e44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.983 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b7db8a7-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.983 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.984 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b7db8a7-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:11 np0005539505 NetworkManager[55134]: <info>  [1764399971.9869] manager: (tap4b7db8a7-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Nov 29 02:06:11 np0005539505 kernel: tap4b7db8a7-c0: entered promiscuous mode
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.988 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:11 np0005539505 nova_compute[186958]: 2025-11-29 07:06:11.992 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:11Z|00286|binding|INFO|Releasing lport 9ee4ef35-f95d-46ca-b0e2-c1d3f1df8f77 from this chassis (sb_readonly=0)
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.991 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b7db8a7-c0, col_values=(('external_ids', {'iface-id': '9ee4ef35-f95d-46ca-b0e2-c1d3f1df8f77'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:11.993 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b7db8a7-cccc-445d-b22b-94b920fc0457.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b7db8a7-cccc-445d-b22b-94b920fc0457.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:12.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ab2d5617-794c-4fc5-bb4c-a1cea5e7b23b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:12.016 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-4b7db8a7-cccc-445d-b22b-94b920fc0457
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/4b7db8a7-cccc-445d-b22b-94b920fc0457.pid.haproxy
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 4b7db8a7-cccc-445d-b22b-94b920fc0457
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:06:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:12.016 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'env', 'PROCESS_TAG=haproxy-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b7db8a7-cccc-445d-b22b-94b920fc0457.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:06:12 np0005539505 podman[226336]: 2025-11-29 07:06:12.377058875 +0000 UTC m=+0.038988043 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:06:12 np0005539505 podman[226336]: 2025-11-29 07:06:12.465784734 +0000 UTC m=+0.127713892 container create 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.646 186962 DEBUG nova.compute.manager [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.647 186962 DEBUG oslo_concurrency.lockutils [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.647 186962 DEBUG oslo_concurrency.lockutils [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.648 186962 DEBUG oslo_concurrency.lockutils [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.648 186962 DEBUG nova.compute.manager [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] No waiting events found dispatching network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:12 np0005539505 nova_compute[186958]: 2025-11-29 07:06:12.648 186962 WARNING nova.compute.manager [req-9376be6b-dfcc-4140-8ce2-d6d3b565b90f req-a6e451f1-abf5-49a6-879f-c3412c21b2db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received unexpected event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:12 np0005539505 systemd[1]: Started libpod-conmon-0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1.scope.
Nov 29 02:06:12 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:06:12 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8149ce55cb9af5b178b194b878807a774f659d33215d0cc44d1af052afdfe20b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:06:12 np0005539505 podman[226336]: 2025-11-29 07:06:12.956849997 +0000 UTC m=+0.618779165 container init 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:06:12 np0005539505 podman[226336]: 2025-11-29 07:06:12.968662871 +0000 UTC m=+0.630592029 container start 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [NOTICE]   (226356) : New worker (226358) forked
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [NOTICE]   (226356) : Loading success.
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.298 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.299 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.300 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.300 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.301 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.313 186962 INFO nova.compute.manager [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Terminating instance#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.327 186962 DEBUG nova.compute.manager [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:06:13 np0005539505 kernel: tap27b3dbda-6b (unregistering): left promiscuous mode
Nov 29 02:06:13 np0005539505 NetworkManager[55134]: <info>  [1764399973.3483] device (tap27b3dbda-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00287|binding|INFO|Releasing lport 27b3dbda-6b3a-4b71-8698-f91704c0d5fa from this chassis (sb_readonly=0)
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00288|binding|INFO|Setting lport 27b3dbda-6b3a-4b71-8698-f91704c0d5fa down in Southbound
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00289|binding|INFO|Removing iface tap27b3dbda-6b ovn-installed in OVS
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.372 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ad:71 10.100.0.9'], port_security=['fa:16:3e:86:ad:71 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '82b53ac1-6801-4ad8-b4d1-34e59c5d20d6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=27b3dbda-6b3a-4b71-8698-f91704c0d5fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.374 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 27b3dbda-6b3a-4b71-8698-f91704c0d5fa in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.376 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.377 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a382a0d8-e64d-4392-9925-07d5b2f78e5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.378 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore#033[00m
Nov 29 02:06:13 np0005539505 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Nov 29 02:06:13 np0005539505 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000004b.scope: Consumed 3.160s CPU time.
Nov 29 02:06:13 np0005539505 systemd-machined[153285]: Machine qemu-36-instance-0000004b terminated.
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.587 186962 INFO nova.virt.libvirt.driver [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Instance destroyed successfully.#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.587 186962 DEBUG nova.objects.instance [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.599 186962 DEBUG nova.virt.libvirt.vif [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-2061473623',display_name='tempest-ServersTestJSON-server-2061473623',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-2061473623',id=75,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:06:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-f75ra1ee',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:12Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=82b53ac1-6801-4ad8-b4d1-34e59c5d20d6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.600 186962 DEBUG nova.network.os_vif_util [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "address": "fa:16:3e:86:ad:71", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27b3dbda-6b", "ovs_interfaceid": "27b3dbda-6b3a-4b71-8698-f91704c0d5fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.600 186962 DEBUG nova.network.os_vif_util [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.601 186962 DEBUG os_vif [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.603 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27b3dbda-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.605 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.608 186962 INFO os_vif [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ad:71,bridge_name='br-int',has_traffic_filtering=True,id=27b3dbda-6b3a-4b71-8698-f91704c0d5fa,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27b3dbda-6b')#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.608 186962 INFO nova.virt.libvirt.driver [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Deleting instance files /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6_del#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.609 186962 INFO nova.virt.libvirt.driver [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Deletion of /var/lib/nova/instances/82b53ac1-6801-4ad8-b4d1-34e59c5d20d6_del complete#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.630 186962 DEBUG nova.compute.manager [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.630 186962 DEBUG oslo_concurrency.lockutils [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.631 186962 DEBUG oslo_concurrency.lockutils [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.631 186962 DEBUG oslo_concurrency.lockutils [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.632 186962 DEBUG nova.compute.manager [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] No waiting events found dispatching network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.632 186962 WARNING nova.compute.manager [req-fe6afad9-a516-46b0-99fa-77f288f35758 req-09c18e1e-31e7-4909-a4ed-05388f8c62b8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received unexpected event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.683 186962 INFO nova.compute.manager [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.684 186962 DEBUG oslo.service.loopingcall [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.684 186962 DEBUG nova.compute.manager [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.685 186962 DEBUG nova.network.neutron [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.693 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.694 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.694 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.694 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.695 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.709 186962 INFO nova.compute.manager [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Terminating instance#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.719 186962 DEBUG nova.compute.manager [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:06:13 np0005539505 kernel: tap2fd6d05d-cc (unregistering): left promiscuous mode
Nov 29 02:06:13 np0005539505 NetworkManager[55134]: <info>  [1764399973.7391] device (tap2fd6d05d-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00290|binding|INFO|Releasing lport 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 from this chassis (sb_readonly=0)
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00291|binding|INFO|Setting lport 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 down in Southbound
Nov 29 02:06:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:13Z|00292|binding|INFO|Removing iface tap2fd6d05d-cc ovn-installed in OVS
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.805 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:13.810 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:c0:4a 10.100.0.8'], port_security=['fa:16:3e:4f:c0:4a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '713a4384-a974-4edd-9e95-ef29b5169889', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '38296bde2b4d4a719f568c7ed5c6c0bf', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35aeb8e2-829e-4031-8525-cc861ebfb4bf', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8248c843-6b85-4e58-8f04-912019203745, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2fd6d05d-ccf2-45e4-8e10-8d64843e6150) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Deactivated successfully.
Nov 29 02:06:13 np0005539505 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004c.scope: Consumed 2.618s CPU time.
Nov 29 02:06:13 np0005539505 systemd-machined[153285]: Machine qemu-37-instance-0000004c terminated.
Nov 29 02:06:13 np0005539505 NetworkManager[55134]: <info>  [1764399973.9382] manager: (tap2fd6d05d-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/146)
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [NOTICE]   (226237) : haproxy version is 2.8.14-c23fe91
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [NOTICE]   (226237) : path to executable is /usr/sbin/haproxy
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [WARNING]  (226237) : Exiting Master process...
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [WARNING]  (226237) : Exiting Master process...
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [ALERT]    (226237) : Current worker (226239) exited with code 143 (Terminated)
Nov 29 02:06:13 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226226]: [WARNING]  (226237) : All workers exited. Exiting... (0)
Nov 29 02:06:13 np0005539505 systemd[1]: libpod-74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d.scope: Deactivated successfully.
Nov 29 02:06:13 np0005539505 podman[226389]: 2025-11-29 07:06:13.969174598 +0000 UTC m=+0.505713759 container died 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.978 186962 INFO nova.virt.libvirt.driver [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Instance destroyed successfully.#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.978 186962 DEBUG nova.objects.instance [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lazy-loading 'resources' on Instance uuid 713a4384-a974-4edd-9e95-ef29b5169889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.991 186962 DEBUG nova.virt.libvirt.vif [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:05:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1795175015',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1795175015',id=76,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:06:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='38296bde2b4d4a719f568c7ed5c6c0bf',ramdisk_id='',reservation_id='r-cso933n2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-InstanceActionsV221TestJSON-1014245885',owner_user_name='tempest-InstanceActionsV221TestJSON-1014245885-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:11Z,user_data=None,user_id='61b629c14bdd42a4ab950e1a86b22ac4',uuid=713a4384-a974-4edd-9e95-ef29b5169889,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.992 186962 DEBUG nova.network.os_vif_util [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converting VIF {"id": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "address": "fa:16:3e:4f:c0:4a", "network": {"id": "4b7db8a7-cccc-445d-b22b-94b920fc0457", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-53676293-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "38296bde2b4d4a719f568c7ed5c6c0bf", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd6d05d-cc", "ovs_interfaceid": "2fd6d05d-ccf2-45e4-8e10-8d64843e6150", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.992 186962 DEBUG nova.network.os_vif_util [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.992 186962 DEBUG os_vif [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.994 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.994 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd6d05d-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.995 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.996 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.998 186962 INFO os_vif [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:c0:4a,bridge_name='br-int',has_traffic_filtering=True,id=2fd6d05d-ccf2-45e4-8e10-8d64843e6150,network=Network(4b7db8a7-cccc-445d-b22b-94b920fc0457),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd6d05d-cc')#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.998 186962 INFO nova.virt.libvirt.driver [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Deleting instance files /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889_del#033[00m
Nov 29 02:06:13 np0005539505 nova_compute[186958]: 2025-11-29 07:06:13.999 186962 INFO nova.virt.libvirt.driver [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Deletion of /var/lib/nova/instances/713a4384-a974-4edd-9e95-ef29b5169889_del complete#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.022 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.097 186962 INFO nova.compute.manager [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.097 186962 DEBUG oslo.service.loopingcall [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.100 186962 DEBUG nova.compute.manager [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.100 186962 DEBUG nova.network.neutron [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:06:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d-userdata-shm.mount: Deactivated successfully.
Nov 29 02:06:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay-a64238c8e03e1a897871dec4f921913d6b762d50e7636c040dbc6cf2f821ef8e-merged.mount: Deactivated successfully.
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.761 186962 DEBUG nova.network.neutron [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.785 186962 INFO nova.compute.manager [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Took 1.10 seconds to deallocate network for instance.#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.870 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.870 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:14 np0005539505 podman[226389]: 2025-11-29 07:06:14.888209481 +0000 UTC m=+1.424748632 container cleanup 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:06:14 np0005539505 systemd[1]: libpod-conmon-74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d.scope: Deactivated successfully.
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.942 186962 DEBUG nova.network.neutron [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.980 186962 DEBUG nova.compute.provider_tree [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:14 np0005539505 nova_compute[186958]: 2025-11-29 07:06:14.984 186962 INFO nova.compute.manager [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Took 0.88 seconds to deallocate network for instance.#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.008 186962 DEBUG nova.scheduler.client.report [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.015 186962 DEBUG nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-vif-unplugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.016 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.016 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.016 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.016 186962 DEBUG nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] No waiting events found dispatching network-vif-unplugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.016 186962 WARNING nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received unexpected event network-vif-unplugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.017 186962 DEBUG nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.017 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.017 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.017 186962 DEBUG oslo_concurrency.lockutils [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.017 186962 DEBUG nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] No waiting events found dispatching network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.018 186962 WARNING nova.compute.manager [req-00536ee5-f0ea-42cc-bf23-751ec826d1df req-c65fbd8e-67b5-4486-9187-4e45d383d977 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received unexpected event network-vif-plugged-27b3dbda-6b3a-4b71-8698-f91704c0d5fa for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.053 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.081 186962 INFO nova.scheduler.client.report [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.084 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.084 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.130 186962 DEBUG nova.compute.provider_tree [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.146 186962 DEBUG nova.scheduler.client.report [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.167 186962 DEBUG oslo_concurrency.lockutils [None req-6cb9e491-fcb0-4011-94e5-dd4613b2143e f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "82b53ac1-6801-4ad8-b4d1-34e59c5d20d6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.868s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.169 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.202 186962 INFO nova.scheduler.client.report [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Deleted allocations for instance 713a4384-a974-4edd-9e95-ef29b5169889#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.282 186962 DEBUG oslo_concurrency.lockutils [None req-50c717a8-1f0e-496b-9322-b0fbd29bdf94 61b629c14bdd42a4ab950e1a86b22ac4 38296bde2b4d4a719f568c7ed5c6c0bf - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.757 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-vif-unplugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.757 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.757 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.758 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.758 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] No waiting events found dispatching network-vif-unplugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.758 186962 WARNING nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received unexpected event network-vif-unplugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.758 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.758 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "713a4384-a974-4edd-9e95-ef29b5169889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.759 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.759 186962 DEBUG oslo_concurrency.lockutils [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "713a4384-a974-4edd-9e95-ef29b5169889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.759 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] No waiting events found dispatching network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.759 186962 WARNING nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received unexpected event network-vif-plugged-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.759 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Received event network-vif-deleted-27b3dbda-6b3a-4b71-8698-f91704c0d5fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.760 186962 DEBUG nova.compute.manager [req-a967143f-343f-41e1-9ca7-799bc405dd8e req-f4f81d38-6a70-4cb9-ac30-e45d7ec243d1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Received event network-vif-deleted-2fd6d05d-ccf2-45e4-8e10-8d64843e6150 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:15 np0005539505 podman[226458]: 2025-11-29 07:06:15.76257516 +0000 UTC m=+0.846682558 container remove 74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.772 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2f1422-cf36-4b23-adc8-f5d736cc3161]: (4, ('Sat Nov 29 07:06:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d)\n74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d\nSat Nov 29 07:06:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d)\n74de8bbfcd5076d1e8cc2e8ce8e53c1892bfb2e5e30fe41a23cf8f4e8f507a9d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.774 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ca0271-fe7e-4050-9915-e3d47e4944f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.776 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:15 np0005539505 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.779 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.784 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8be1a512-2d6e-4d09-a210-7f9ecaf54702]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 nova_compute[186958]: 2025-11-29 07:06:15.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.805 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c209d10-59d0-4da0-a0c7-6c8ee9cdc85d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.807 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a4c906-f37d-4ba9-870d-915f444fa3a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.825 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[91524a6b-ff67-4aa3-98fb-3a466073e321]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541752, 'reachable_time': 32560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226481, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.830 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.831 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[744620f2-b5a5-4d1f-8751-416ff79db725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.832 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2fd6d05d-ccf2-45e4-8e10-8d64843e6150 in datapath 4b7db8a7-cccc-445d-b22b-94b920fc0457 unbound from our chassis#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.834 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b7db8a7-cccc-445d-b22b-94b920fc0457, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.835 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4957de2a-4279-403f-888f-dc5ec94c50bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:15.836 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457 namespace which is not needed anymore#033[00m
Nov 29 02:06:15 np0005539505 podman[226473]: 2025-11-29 07:06:15.867188608 +0000 UTC m=+0.061430208 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 29 02:06:16 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [NOTICE]   (226356) : haproxy version is 2.8.14-c23fe91
Nov 29 02:06:16 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [NOTICE]   (226356) : path to executable is /usr/sbin/haproxy
Nov 29 02:06:16 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [WARNING]  (226356) : Exiting Master process...
Nov 29 02:06:16 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [ALERT]    (226356) : Current worker (226358) exited with code 143 (Terminated)
Nov 29 02:06:16 np0005539505 neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457[226352]: [WARNING]  (226356) : All workers exited. Exiting... (0)
Nov 29 02:06:16 np0005539505 systemd[1]: libpod-0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1.scope: Deactivated successfully.
Nov 29 02:06:16 np0005539505 podman[226511]: 2025-11-29 07:06:16.178939561 +0000 UTC m=+0.259493157 container died 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:06:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1-userdata-shm.mount: Deactivated successfully.
Nov 29 02:06:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay-8149ce55cb9af5b178b194b878807a774f659d33215d0cc44d1af052afdfe20b-merged.mount: Deactivated successfully.
Nov 29 02:06:16 np0005539505 podman[226511]: 2025-11-29 07:06:16.685419751 +0000 UTC m=+0.765973367 container cleanup 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:06:16 np0005539505 podman[226540]: 2025-11-29 07:06:16.855593032 +0000 UTC m=+0.151484544 container remove 0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.861 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9f01a5bd-b17b-4c61-870c-a10a06d8edf9]: (4, ('Sat Nov 29 07:06:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457 (0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1)\n0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1\nSat Nov 29 07:06:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457 (0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1)\n0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.863 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[51994e21-fc79-4450-ac13-22a9cc26e1cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.865 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b7db8a7-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:16 np0005539505 nova_compute[186958]: 2025-11-29 07:06:16.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:16 np0005539505 kernel: tap4b7db8a7-c0: left promiscuous mode
Nov 29 02:06:16 np0005539505 nova_compute[186958]: 2025-11-29 07:06:16.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:16 np0005539505 nova_compute[186958]: 2025-11-29 07:06:16.880 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.882 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04aece22-4413-4663-8dc2-6e4dddc436ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.899 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[70fe95a7-9fc3-4e46-83f6-c3a9e053c0f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.901 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b1afe4c6-b4ce-4cc4-b67a-8a1cf6b97d07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.916 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5d9c79-bce8-4935-99dd-536ed3456782]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541931, 'reachable_time': 15275, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226554, 'error': None, 'target': 'ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 systemd[1]: run-netns-ovnmeta\x2d4b7db8a7\x2dcccc\x2d445d\x2db22b\x2d94b920fc0457.mount: Deactivated successfully.
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.919 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b7db8a7-cccc-445d-b22b-94b920fc0457 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:06:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:16.920 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[06fecf92-4f7b-4ad3-9dbd-6b9d4f5d0b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:16 np0005539505 systemd[1]: libpod-conmon-0db86fc59f60bd4d4f2c89b82138d1dadac546af9d88ce45b0011b2843de58f1.scope: Deactivated successfully.
Nov 29 02:06:18 np0005539505 nova_compute[186958]: 2025-11-29 07:06:18.998 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:19 np0005539505 nova_compute[186958]: 2025-11-29 07:06:19.024 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.454 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.455 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.480 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.580 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.580 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.586 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.587 186962 INFO nova.compute.claims [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.700 186962 DEBUG nova.compute.provider_tree [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.714 186962 DEBUG nova.scheduler.client.report [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.734 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.736 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.792 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.792 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.809 186962 INFO nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.827 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.961 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.962 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.962 186962 INFO nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Creating image(s)#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.963 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.963 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.964 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:20 np0005539505 nova_compute[186958]: 2025-11-29 07:06:20.975 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.031 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.033 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.034 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.056 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.115 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.116 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.440 186962 DEBUG nova.policy [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.669 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk 1073741824" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.670 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.671 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.729 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.731 186962 DEBUG nova.virt.disk.api [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Checking if we can resize image /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.731 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.789 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.791 186962 DEBUG nova.virt.disk.api [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Cannot resize image /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:06:21 np0005539505 nova_compute[186958]: 2025-11-29 07:06:21.791 186962 DEBUG nova.objects.instance [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'migration_context' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.160 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.161 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Ensure instance console log exists: /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.162 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.162 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.162 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:22 np0005539505 nova_compute[186958]: 2025-11-29 07:06:22.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:23 np0005539505 nova_compute[186958]: 2025-11-29 07:06:23.263 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Successfully created port: 79d28453-30ed-42cf-a66e-786727a5b61c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.000 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.027 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:24 np0005539505 podman[226574]: 2025-11-29 07:06:24.731792989 +0000 UTC m=+0.059924355 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:06:24 np0005539505 podman[226575]: 2025-11-29 07:06:24.752410902 +0000 UTC m=+0.079688474 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.947 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Successfully updated port: 79d28453-30ed-42cf-a66e-786727a5b61c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.970 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.971 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquired lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:24 np0005539505 nova_compute[186958]: 2025-11-29 07:06:24.971 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:06:25 np0005539505 nova_compute[186958]: 2025-11-29 07:06:25.334 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.191 186962 DEBUG nova.network.neutron [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Updating instance_info_cache with network_info: [{"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.217 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Releasing lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.218 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance network_info: |[{"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.222 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Start _get_guest_xml network_info=[{"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.229 186962 WARNING nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.236 186962 DEBUG nova.virt.libvirt.host [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.236 186962 DEBUG nova.virt.libvirt.host [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.241 186962 DEBUG nova.virt.libvirt.host [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.241 186962 DEBUG nova.virt.libvirt.host [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.243 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.243 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.244 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.244 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.244 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.244 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.245 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.245 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.245 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.245 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.246 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.246 186962 DEBUG nova.virt.hardware [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.250 186962 DEBUG nova.virt.libvirt.vif [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-707522399',display_name='tempest-ServersTestJSON-server-707522399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-707522399',id=77,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-p5rx40om',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:20Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.250 186962 DEBUG nova.network.os_vif_util [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.251 186962 DEBUG nova.network.os_vif_util [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.252 186962 DEBUG nova.objects.instance [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'pci_devices' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.270 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <uuid>d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8</uuid>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <name>instance-0000004d</name>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-707522399</nova:name>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:06:26</nova:creationTime>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:user uuid="f2f86d3bd4814a09966b869dd539a6c9">tempest-ServersTestJSON-373958708-project-member</nova:user>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:project uuid="1dba9539037a4e9dbf33cba140fe21fe">tempest-ServersTestJSON-373958708</nova:project>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        <nova:port uuid="79d28453-30ed-42cf-a66e-786727a5b61c">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="serial">d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="uuid">d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.config"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:eb:2a:e1"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <target dev="tap79d28453-30"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/console.log" append="off"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:06:26 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:06:26 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:06:26 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:06:26 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.271 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Preparing to wait for external event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.272 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.272 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.272 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.273 186962 DEBUG nova.virt.libvirt.vif [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-707522399',display_name='tempest-ServersTestJSON-server-707522399',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-707522399',id=77,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-p5rx40om',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-mem
ber'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:06:20Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.273 186962 DEBUG nova.network.os_vif_util [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.274 186962 DEBUG nova.network.os_vif_util [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.274 186962 DEBUG os_vif [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.276 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.277 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.277 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.279 186962 DEBUG nova.compute.manager [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-changed-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.280 186962 DEBUG nova.compute.manager [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Refreshing instance network info cache due to event network-changed-79d28453-30ed-42cf-a66e-786727a5b61c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.280 186962 DEBUG oslo_concurrency.lockutils [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.280 186962 DEBUG oslo_concurrency.lockutils [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.280 186962 DEBUG nova.network.neutron [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Refreshing network info cache for port 79d28453-30ed-42cf-a66e-786727a5b61c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.283 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.284 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap79d28453-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.284 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap79d28453-30, col_values=(('external_ids', {'iface-id': '79d28453-30ed-42cf-a66e-786727a5b61c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:2a:e1', 'vm-uuid': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.333 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539505 NetworkManager[55134]: <info>  [1764399986.3342] manager: (tap79d28453-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.335 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.338 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.339 186962 INFO os_vif [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30')#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.892 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.892 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.893 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] No VIF found with MAC fa:16:3e:eb:2a:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:06:26 np0005539505 nova_compute[186958]: 2025-11-29 07:06:26.893 186962 INFO nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Using config drive#033[00m
Nov 29 02:06:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:26.944 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.608 186962 INFO nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Creating config drive at /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.config#033[00m
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.613 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgd0gwvq_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.740 186962 DEBUG oslo_concurrency.processutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgd0gwvq_" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:06:27 np0005539505 kernel: tap79d28453-30: entered promiscuous mode
Nov 29 02:06:27 np0005539505 NetworkManager[55134]: <info>  [1764399987.7912] manager: (tap79d28453-30): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.791 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:27Z|00293|binding|INFO|Claiming lport 79d28453-30ed-42cf-a66e-786727a5b61c for this chassis.
Nov 29 02:06:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:27Z|00294|binding|INFO|79d28453-30ed-42cf-a66e-786727a5b61c: Claiming fa:16:3e:eb:2a:e1 10.100.0.8
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.796 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.818 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:2a:e1 10.100.0.8'], port_security=['fa:16:3e:eb:2a:e1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=79d28453-30ed-42cf-a66e-786727a5b61c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.819 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 79d28453-30ed-42cf-a66e-786727a5b61c in datapath 9cf3a513-f54e-430e-b018-befaa643b464 bound to our chassis#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.820 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9cf3a513-f54e-430e-b018-befaa643b464#033[00m
Nov 29 02:06:27 np0005539505 systemd-udevd[226645]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:27 np0005539505 systemd-machined[153285]: New machine qemu-38-instance-0000004d.
Nov 29 02:06:27 np0005539505 NetworkManager[55134]: <info>  [1764399987.8361] device (tap79d28453-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:06:27 np0005539505 NetworkManager[55134]: <info>  [1764399987.8383] device (tap79d28453-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.836 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f90eafea-7ff7-499c-bfd2-e25075810838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.838 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9cf3a513-f1 in ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.840 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9cf3a513-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.840 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7d3352-b525-40a3-bde2-79e0c13dd6ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.840 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6656e9-df65-40c7-b668-e6d5ec9cfd39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.854 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:27 np0005539505 systemd[1]: Started Virtual Machine qemu-38-instance-0000004d.
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.856 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8d387808-314b-487d-ab02-1bf4ec307a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:27Z|00295|binding|INFO|Setting lport 79d28453-30ed-42cf-a66e-786727a5b61c ovn-installed in OVS
Nov 29 02:06:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:27Z|00296|binding|INFO|Setting lport 79d28453-30ed-42cf-a66e-786727a5b61c up in Southbound
Nov 29 02:06:27 np0005539505 nova_compute[186958]: 2025-11-29 07:06:27.861 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.870 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[125f9103-b228-4c08-86f2-5b6928965a22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.903 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7086bd-3bd1-4587-9b87-cc15de10f0a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 NetworkManager[55134]: <info>  [1764399987.9103] manager: (tap9cf3a513-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/149)
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.909 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0de8e5-2d93-45c9-b880-99d9b20d5d7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 systemd-udevd[226649]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.939 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fe50cc79-b910-419c-add4-b23e78263272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.942 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5342610a-11e7-45ba-9bb2-4568cbaf6bfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 NetworkManager[55134]: <info>  [1764399987.9623] device (tap9cf3a513-f0): carrier: link connected
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.967 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[179a7bfb-4851-4ee4-87c7-6aa3968c7d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:27.983 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6cc40e-0538-4b64-88fe-0cbb570f607c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543554, 'reachable_time': 39910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226681, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.002 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[968f8e14-2ab7-41d1-9248-b0884614d8cb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:28ee'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 543554, 'tstamp': 543554}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226685, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.016 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[db5f60b1-ac8d-4569-8259-9a513d22f318]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9cf3a513-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:28:ee'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543554, 'reachable_time': 39910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226687, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.049 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d875ad81-55f9-4e91-80f4-acf904f434b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.074 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399988.074528, d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.075 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] VM Started (Lifecycle Event)#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.095 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.098 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399988.0753827, d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.099 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.116 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[23301169-6931-4670-9c3e-e7bdf32950d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.118 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.118 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.118 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.119 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9cf3a513-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.120 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:28 np0005539505 NetworkManager[55134]: <info>  [1764399988.1210] manager: (tap9cf3a513-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Nov 29 02:06:28 np0005539505 kernel: tap9cf3a513-f0: entered promiscuous mode
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.122 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.124 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.125 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9cf3a513-f0, col_values=(('external_ids', {'iface-id': 'ed5aef73-67a0-4ad1-8aea-9c411786c18e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:28 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:28Z|00297|binding|INFO|Releasing lport ed5aef73-67a0-4ad1-8aea-9c411786c18e from this chassis (sb_readonly=0)
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.126 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.127 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.128 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd45295-685b-455f-a3fa-c2df125ea61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.129 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9cf3a513-f54e-430e-b018-befaa643b464.pid.haproxy
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9cf3a513-f54e-430e-b018-befaa643b464
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:06:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:28.130 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'env', 'PROCESS_TAG=haproxy-9cf3a513-f54e-430e-b018-befaa643b464', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9cf3a513-f54e-430e-b018-befaa643b464.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.137 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.144 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.200 186962 DEBUG nova.compute.manager [req-d98bd268-45d3-4678-9817-1618b0a049db req-0c3368c7-98c2-4daf-955e-97ff1a22ed7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.201 186962 DEBUG oslo_concurrency.lockutils [req-d98bd268-45d3-4678-9817-1618b0a049db req-0c3368c7-98c2-4daf-955e-97ff1a22ed7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.202 186962 DEBUG oslo_concurrency.lockutils [req-d98bd268-45d3-4678-9817-1618b0a049db req-0c3368c7-98c2-4daf-955e-97ff1a22ed7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.202 186962 DEBUG oslo_concurrency.lockutils [req-d98bd268-45d3-4678-9817-1618b0a049db req-0c3368c7-98c2-4daf-955e-97ff1a22ed7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.202 186962 DEBUG nova.compute.manager [req-d98bd268-45d3-4678-9817-1618b0a049db req-0c3368c7-98c2-4daf-955e-97ff1a22ed7b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Processing event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.203 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.206 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764399988.2063577, d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.206 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.208 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.210 186962 INFO nova.virt.libvirt.driver [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance spawned successfully.#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.211 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.231 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.238 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.241 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.241 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.242 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.242 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.242 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.243 186962 DEBUG nova.virt.libvirt.driver [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.278 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.325 186962 INFO nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Took 7.36 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.326 186962 DEBUG nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.393 186962 DEBUG nova.network.neutron [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Updated VIF entry in instance network info cache for port 79d28453-30ed-42cf-a66e-786727a5b61c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.395 186962 DEBUG nova.network.neutron [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Updating instance_info_cache with network_info: [{"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.433 186962 DEBUG oslo_concurrency.lockutils [req-b284dd4a-86ba-4c8c-ba29-71839d83e450 req-ef5f877d-307e-4b16-87ae-d8eaa603532f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.477 186962 INFO nova.compute.manager [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Took 7.93 seconds to build instance.#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.513 186962 DEBUG oslo_concurrency.lockutils [None req-23ecd6cf-28ca-43a2-b70a-d1974815a4d0 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.058s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:28 np0005539505 podman[226719]: 2025-11-29 07:06:28.450848744 +0000 UTC m=+0.023204547 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.586 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399973.5842898, 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.586 186962 INFO nova.compute.manager [-] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.607 186962 DEBUG nova.compute.manager [None req-220ce164-73b7-462e-a405-0635f4f06655 - - - - - -] [instance: 82b53ac1-6801-4ad8-b4d1-34e59c5d20d6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.977 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764399973.976222, 713a4384-a974-4edd-9e95-ef29b5169889 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:06:28 np0005539505 nova_compute[186958]: 2025-11-29 07:06:28.979 186962 INFO nova.compute.manager [-] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:06:29 np0005539505 nova_compute[186958]: 2025-11-29 07:06:29.029 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:29 np0005539505 nova_compute[186958]: 2025-11-29 07:06:29.051 186962 DEBUG nova.compute.manager [None req-f20037a0-7060-484a-ab0a-7536618a221f - - - - - -] [instance: 713a4384-a974-4edd-9e95-ef29b5169889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:30 np0005539505 podman[226719]: 2025-11-29 07:06:30.157472693 +0000 UTC m=+1.729828466 container create 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 02:06:30 np0005539505 systemd[1]: Started libpod-conmon-5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d.scope.
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.381 186962 DEBUG nova.compute.manager [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.383 186962 DEBUG oslo_concurrency.lockutils [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.383 186962 DEBUG oslo_concurrency.lockutils [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.384 186962 DEBUG oslo_concurrency.lockutils [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.384 186962 DEBUG nova.compute.manager [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] No waiting events found dispatching network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:30 np0005539505 nova_compute[186958]: 2025-11-29 07:06:30.385 186962 WARNING nova.compute.manager [req-ba22734d-2fee-46cb-ad82-371358c9c5b9 req-b3ea0486-3ef2-4ea5-88fa-3285f03e0151 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received unexpected event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c for instance with vm_state active and task_state None.#033[00m
Nov 29 02:06:30 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:06:30 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c897ff95fa385462f4ccee135732f55f8dbb75a1cc49451ce6ebf052a6efa46/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:06:30 np0005539505 podman[226732]: 2025-11-29 07:06:30.439209459 +0000 UTC m=+0.247886960 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:06:30 np0005539505 podman[226719]: 2025-11-29 07:06:30.906332525 +0000 UTC m=+2.478688378 container init 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:06:30 np0005539505 podman[226719]: 2025-11-29 07:06:30.911545473 +0000 UTC m=+2.483901246 container start 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:06:30 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [NOTICE]   (226760) : New worker (226762) forked
Nov 29 02:06:30 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [NOTICE]   (226760) : Loading success.
Nov 29 02:06:31 np0005539505 nova_compute[186958]: 2025-11-29 07:06:31.334 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:32 np0005539505 podman[226771]: 2025-11-29 07:06:32.72311864 +0000 UTC m=+0.053482113 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd)
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.826 186962 DEBUG oslo_concurrency.lockutils [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.827 186962 DEBUG oslo_concurrency.lockutils [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.828 186962 DEBUG nova.compute.manager [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.832 186962 DEBUG nova.compute.manager [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.833 186962 DEBUG nova.objects.instance [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'flavor' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.864 186962 DEBUG nova.objects.instance [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'info_cache' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:32 np0005539505 nova_compute[186958]: 2025-11-29 07:06:32.894 186962 DEBUG nova.virt.libvirt.driver [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:06:34 np0005539505 nova_compute[186958]: 2025-11-29 07:06:34.032 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:34.428 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:34.429 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:06:34 np0005539505 nova_compute[186958]: 2025-11-29 07:06:34.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:36 np0005539505 nova_compute[186958]: 2025-11-29 07:06:36.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:39 np0005539505 nova_compute[186958]: 2025-11-29 07:06:39.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:39.432 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:41 np0005539505 nova_compute[186958]: 2025-11-29 07:06:41.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:42 np0005539505 podman[226809]: 2025-11-29 07:06:42.731303832 +0000 UTC m=+0.052587638 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:06:42 np0005539505 podman[226808]: 2025-11-29 07:06:42.736166299 +0000 UTC m=+0.062165498 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Nov 29 02:06:42 np0005539505 nova_compute[186958]: 2025-11-29 07:06:42.939 186962 DEBUG nova.virt.libvirt.driver [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:06:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:43Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:2a:e1 10.100.0.8
Nov 29 02:06:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:43Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:2a:e1 10.100.0.8
Nov 29 02:06:44 np0005539505 nova_compute[186958]: 2025-11-29 07:06:44.035 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:46 np0005539505 nova_compute[186958]: 2025-11-29 07:06:46.341 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:46 np0005539505 podman[226852]: 2025-11-29 07:06:46.723188271 +0000 UTC m=+0.056437737 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.086 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'name': 'tempest-ServersTestJSON-server-707522399', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004d', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '1dba9539037a4e9dbf33cba140fe21fe', 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'hostId': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.087 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.090 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 / tap79d28453-30 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.090 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab79389b-6541-416a-9688-1bda01b5fdc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.087556', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8be847c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '6af77e84b0007bab50604fbd209c4e849b85ecaf9d7cc706d96e4f931f03776a'}]}, 'timestamp': '2025-11-29 07:06:48.090753', '_unique_id': '07e429335763400a92901dd49f1d5efb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.092 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6833de87-b077-452f-9d2b-eb79aaca0c79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.093078', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8beee3a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '9a59cdddb4e13da4b5fb6ab854cfe4f8a96fbaf16f7f3905008a2944f2ae3931'}]}, 'timestamp': '2025-11-29 07:06:48.093370', '_unique_id': '43e80964e88c4960a3eb3f708a572b6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.093 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.104 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.104 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c519cc53-2e28-451a-895e-cd35a814a397', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.094571', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c0acc0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': '8e6d9fdb945264a32eff7ba367fd4af0ff26e5c64a338040e924065031d38b45'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.094571', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c0bb48-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': '0532885f03570cea6bcb598cbad6ab2853ce5ca415bcf6d3901eec58c881ac39'}]}, 'timestamp': '2025-11-29 07:06:48.105192', '_unique_id': '96c63a95cd264344aa2aa07abb56b137'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.106 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.130 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.requests volume: 305 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.131 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7028dbd-c5c2-4d80-abe2-fe6a402b7fe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 305, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.107289', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c4aa1e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '5f24b46f1c32834979092c9cc7de4f9b3efa24733d59c523a3b2a4f17441d4fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.107289', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c4b7f2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': 'f9ad8d6b08a2118c5abc8503e7c60937e88a235e77400b4b592f24528b07bcaf'}]}, 'timestamp': '2025-11-29 07:06:48.131345', '_unique_id': '83608981837d4041a5e0a41998f3fbff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.146 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/memory.usage volume: 40.5546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97fd0fd7-60d7-4cd8-86b9-b48a1915d1a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.5546875, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'timestamp': '2025-11-29T07:06:48.133501', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f8c723c0-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.787538552, 'message_signature': 'c1b3c5279b459c930958f8d771e7e8feb80594b3842d49836271ff82b94808b2'}]}, 'timestamp': '2025-11-29 07:06:48.147249', '_unique_id': '3fad330f3567424bbd931cba71621194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.149 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1132627c-5bc6-43c5-98ef-fe1f6a284a6f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.149131', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c77cbc-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '3217937f031e4d0eefeacca909de8d85d43f870ec946cb5b68ded3a9c00c33e0'}]}, 'timestamp': '2025-11-29 07:06:48.149458', '_unique_id': '2f3ccbc81a784f39af2e84c487dcca7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.150 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29cce029-84ff-4b6e-9633-e302d3a70b0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.150661', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c7b5d8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '498866174cd1a87b973424df6ce1adc333928b387547e75158d9d079bd158115'}]}, 'timestamp': '2025-11-29 07:06:48.150892', '_unique_id': '391f9ba306744b6aa19a33ca7a52bc87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39072b5f-8e3f-4d79-8759-4ee081b24940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.152161', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c7f2c8-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '79692084d86f89a0c14a18df3baf6141a0a8c74de160c85b919af7af4d29575f'}]}, 'timestamp': '2025-11-29 07:06:48.152451', '_unique_id': '8e8a4c354c3d44c38d8c88f9df39bd81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.153 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.requests volume: 1092 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.153 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32134ef0-c7b0-4c45-86a4-8f3194b8a005', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1092, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.153595', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c82874-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '727930339654a2859d2fbf360695e6807b5ce7557e7a9425083889b14c028373'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.153595', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c83080-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '270aff08fcdfae0a5f83d9800bbbfa6047987f272e2587b1a9c7ad9842bf6e99'}]}, 'timestamp': '2025-11-29 07:06:48.154034', '_unique_id': 'cb7217ddd81a4e209a0e4eb75edc42a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.155 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.155 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>]
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.155 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28419c17-6f94-4f7b-bf6d-9fe5f44ab02c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.155867', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c881a2-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': 'f7b96e0d3906e76f7947023e0d5e3ebd062e8d7a4f2d75fc0b1c780526e5671b'}]}, 'timestamp': '2025-11-29 07:06:48.156108', '_unique_id': '1d3758aadc45481c81b13026e15633f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.157 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f2d1083-1a39-466a-ba5c-e5c900c63fcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.157354', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c8bc80-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '0d6b8411f1ba6391905d0ee08126e582610c10a2fb9ddff834d63020102c6a29'}]}, 'timestamp': '2025-11-29 07:06:48.157615', '_unique_id': 'ce37246483a241d38edc3cf2972f1100'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.158 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.incoming.bytes volume: 1262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afbfed37-6847-4e55-b752-be4fbc1e146c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1262, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.158886', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c8f718-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '75367d2fb41b36fc1ed56167225edc3fae87c54375260cd4bc3e8d1eab39f417'}]}, 'timestamp': '2025-11-29 07:06:48.159248', '_unique_id': 'b261bfb39f83456ca466a49ab09a56c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.160 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea5c9d71-8b0c-47bf-bd66-2aa58d665aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.160605', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8c93a7a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '34b70ed14abf818ea7c7ab11c2cb9ae127d729866da61e9b603f50516f2cbaa1'}]}, 'timestamp': '2025-11-29 07:06:48.160863', '_unique_id': '10d913c38e1a40d2badf29c44ea0905e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.162 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.162 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f60db4c4-b1fc-42ca-b9f3-a5255f3d1cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.162062', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c97350-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': 'e13e0b1b50ac46cf3dfe3f5f060a1c2ca6a0382809e436ce20c1b7480ee4a2ea'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.162062', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c97cce-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': '4434edbbd1c40682b900354663312057295f450344d9bec34bf0342b291a9c33'}]}, 'timestamp': '2025-11-29 07:06:48.162522', '_unique_id': '7a309837a357486aba14f7cd54dec419'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.163 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.latency volume: 424334946 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.latency volume: 25996550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9982ae95-e5eb-4352-88e2-cc534dc40ff6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 424334946, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.163899', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8c9bb6c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '99d11ea029de2b3d05602009e94604731d096258e301073e6f96ed0024b4a65f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25996550, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 
'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.163899', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8c9c56c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '5f6066b43aa54d972fa8c1ea28e6fd8166d4b9f79e5890c1d88e93f814787cc2'}]}, 'timestamp': '2025-11-29 07:06:48.164422', '_unique_id': '945ea9033b864814bd0fc2b91c505342'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.164 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>]
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.165 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.latency volume: 24979419686 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae4883a5-83ca-40f1-b7ee-00e01c3010ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24979419686, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.165976', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8ca0c34-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '2f89d0dc162dd17b1696389c99b7e7c61ca643537949cbce166275ebf22407d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 
'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.165976', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8ca160c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '7ddbb7fec6fa5319ab13c4c73fe8cc97f50108e583962ab0cfc9bb3776297ddd'}]}, 'timestamp': '2025-11-29 07:06:48.166444', '_unique_id': 'ffb22aabd7ba4288abeb7139f6390f5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.167 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.167 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>]
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.167 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.167 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/cpu volume: 11490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24b06632-a2ae-422f-a9c5-28976542cc95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11490000000, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'timestamp': '2025-11-29T07:06:48.167855', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f8ca5572-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.787538552, 'message_signature': '19eca38b1ed59ec2b0eb6984bbbc13efceb7ed877b00fa4f580889fe01a1f1c0'}]}, 'timestamp': '2025-11-29 07:06:48.168076', '_unique_id': 'a5a5c835e6414782b2ceb646b099b9d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.168 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.169 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '920dac68-9510-40d8-a4ef-48e060c007c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'instance-0000004d-d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-tap79d28453-30', 'timestamp': '2025-11-29T07:06:48.169318', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'tap79d28453-30', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:eb:2a:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap79d28453-30'}, 'message_id': 'f8ca8e84-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.72841091, 'message_signature': '71b269ed3fd902b1615032342d452fbbb35ff5c2ef45d4b7b38bf9c105461bd3'}]}, 'timestamp': '2025-11-29 07:06:48.169543', '_unique_id': 'e5f5a1ea5e704e818e9f79bd73127765'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersTestJSON-server-707522399>]
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.bytes volume: 30325248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88d8670e-e03f-4c74-9156-9641aa5eaf03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30325248, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.171006', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cad04c-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': 'a37bc3252be248055cae9f07137f455d31d901a9944a5e855ffdb2ca2da876ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.171006', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cad92a-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '4f0cc9ce59603e839e9d23a1c9f6df4a718835420e62ca439aa934859a25c7ab'}]}, 'timestamp': '2025-11-29 07:06:48.171440', '_unique_id': '7337073517a14ccb9f4106a1d2a1f4f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.172 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.172 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95ec0d49-c938-4d09-aa18-2f7b2cc05e18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.172629', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cb0fda-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': '75d9f30bfbb0386cb622b4921b68e51f43fcf1e0dd71824e517cca02099f566c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.172629', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cb176e-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.735427179, 'message_signature': 'cf4ba436c4dbb5c49e58f042f7eeb4e5d29867e5cdfaa82f60774d2d193998c1'}]}, 'timestamp': '2025-11-29 07:06:48.173061', '_unique_id': '45fc856d0f4a4ef4827005b965dbab4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.174 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.bytes volume: 72863744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.174 12 DEBUG ceilometer.compute.pollsters [-] d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0527aceb-cc7e-4e32-bc8f-7142273cf25e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72863744, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-vda', 'timestamp': '2025-11-29T07:06:48.174312', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f8cb51de-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': 'cd5f6a7fe6b78a994d3be7b57537945a014c22d2047564d10f403cb6fb97a3c2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f2f86d3bd4814a09966b869dd539a6c9', 'user_name': None, 'project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'project_name': None, 'resource_id': 
'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-sda', 'timestamp': '2025-11-29T07:06:48.174312', 'resource_metadata': {'display_name': 'tempest-ServersTestJSON-server-707522399', 'name': 'instance-0000004d', 'instance_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'instance_type': 'm1.nano', 'host': 'aaa01ba80346530e0967826d885b9e0cae043c40f76ee08f616d7cca', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f8cb5d14-ccf1-11f0-8954-fa163e5a5606', 'monotonic_time': 5455.74819336, 'message_signature': '710b6c500043113e7b53ee8430fd92f6c3f37ca8c6bf1245b9840c1631d24e07'}]}, 'timestamp': '2025-11-29 07:06:48.174824', '_unique_id': '4f61edb144ef407c804ba711e945d8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:06:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:06:48 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:06:49 np0005539505 nova_compute[186958]: 2025-11-29 07:06:49.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:49 np0005539505 nova_compute[186958]: 2025-11-29 07:06:49.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:51 np0005539505 kernel: tap79d28453-30 (unregistering): left promiscuous mode
Nov 29 02:06:51 np0005539505 NetworkManager[55134]: <info>  [1764400011.3212] device (tap79d28453-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:06:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:51Z|00298|binding|INFO|Releasing lport 79d28453-30ed-42cf-a66e-786727a5b61c from this chassis (sb_readonly=0)
Nov 29 02:06:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:51Z|00299|binding|INFO|Setting lport 79d28453-30ed-42cf-a66e-786727a5b61c down in Southbound
Nov 29 02:06:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:06:51Z|00300|binding|INFO|Removing iface tap79d28453-30 ovn-installed in OVS
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.328 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.332 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:51.338 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:2a:e1 10.100.0.8'], port_security=['fa:16:3e:eb:2a:e1 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'd1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf3a513-f54e-430e-b018-befaa643b464', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1dba9539037a4e9dbf33cba140fe21fe', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fc8ab121-ee69-4ab4-9a39-25b26b293132', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a3fd0639-e84a-4389-a7f3-f9ac2c360b5e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=79d28453-30ed-42cf-a66e-786727a5b61c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:06:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:51.340 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 79d28453-30ed-42cf-a66e-786727a5b61c in datapath 9cf3a513-f54e-430e-b018-befaa643b464 unbound from our chassis#033[00m
Nov 29 02:06:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:51.341 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9cf3a513-f54e-430e-b018-befaa643b464, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:06:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:51.342 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f75a81a-835c-4d45-a569-5e7ee1f6728a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:51.343 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 namespace which is not needed anymore#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.343 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.350 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:51 np0005539505 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Nov 29 02:06:51 np0005539505 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000004d.scope: Consumed 13.021s CPU time.
Nov 29 02:06:51 np0005539505 systemd-machined[153285]: Machine qemu-38-instance-0000004d terminated.
Nov 29 02:06:51 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [NOTICE]   (226760) : haproxy version is 2.8.14-c23fe91
Nov 29 02:06:51 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [NOTICE]   (226760) : path to executable is /usr/sbin/haproxy
Nov 29 02:06:51 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [WARNING]  (226760) : Exiting Master process...
Nov 29 02:06:51 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [ALERT]    (226760) : Current worker (226762) exited with code 143 (Terminated)
Nov 29 02:06:51 np0005539505 neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464[226744]: [WARNING]  (226760) : All workers exited. Exiting... (0)
Nov 29 02:06:51 np0005539505 systemd[1]: libpod-5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d.scope: Deactivated successfully.
Nov 29 02:06:51 np0005539505 podman[226893]: 2025-11-29 07:06:51.594580304 +0000 UTC m=+0.166878949 container died 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.633 186962 DEBUG nova.compute.manager [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-vif-unplugged-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.634 186962 DEBUG oslo_concurrency.lockutils [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.634 186962 DEBUG oslo_concurrency.lockutils [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.634 186962 DEBUG oslo_concurrency.lockutils [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.634 186962 DEBUG nova.compute.manager [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] No waiting events found dispatching network-vif-unplugged-79d28453-30ed-42cf-a66e-786727a5b61c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.635 186962 WARNING nova.compute.manager [req-387103b1-9526-4762-8c98-c865bf920229 req-d365a7b1-c849-4ba8-bc1e-8821a0002c1c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received unexpected event network-vif-unplugged-79d28453-30ed-42cf-a66e-786727a5b61c for instance with vm_state active and task_state powering-off.#033[00m
Nov 29 02:06:51 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d-userdata-shm.mount: Deactivated successfully.
Nov 29 02:06:51 np0005539505 systemd[1]: var-lib-containers-storage-overlay-3c897ff95fa385462f4ccee135732f55f8dbb75a1cc49451ce6ebf052a6efa46-merged.mount: Deactivated successfully.
Nov 29 02:06:51 np0005539505 podman[226893]: 2025-11-29 07:06:51.905620007 +0000 UTC m=+0.477918652 container cleanup 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:06:51 np0005539505 systemd[1]: libpod-conmon-5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d.scope: Deactivated successfully.
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.975 186962 INFO nova.virt.libvirt.driver [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance shutdown successfully after 19 seconds.#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.980 186962 INFO nova.virt.libvirt.driver [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance destroyed successfully.#033[00m
Nov 29 02:06:51 np0005539505 nova_compute[186958]: 2025-11-29 07:06:51.980 186962 DEBUG nova.objects.instance [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'numa_topology' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:52 np0005539505 nova_compute[186958]: 2025-11-29 07:06:52.018 186962 DEBUG nova.compute.manager [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:06:52 np0005539505 nova_compute[186958]: 2025-11-29 07:06:52.086 186962 DEBUG oslo_concurrency.lockutils [None req-c7023af5-36f0-4b78-a156-fab61d564b65 f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 19.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:52 np0005539505 nova_compute[186958]: 2025-11-29 07:06:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:52 np0005539505 podman[226937]: 2025-11-29 07:06:52.695487139 +0000 UTC m=+0.769643741 container remove 5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.703 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[242b563c-c90d-4172-ab30-8b6375f17583]: (4, ('Sat Nov 29 07:06:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d)\n5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d\nSat Nov 29 07:06:51 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 (5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d)\n5d6b86dd4df22eef20cd9423c4a1b2cac280b65b3a4db0c0e2463899bcb02f8d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.706 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0e717990-441b-40dc-86e5-31fd02bf3ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.707 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9cf3a513-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:52 np0005539505 nova_compute[186958]: 2025-11-29 07:06:52.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:52 np0005539505 kernel: tap9cf3a513-f0: left promiscuous mode
Nov 29 02:06:52 np0005539505 nova_compute[186958]: 2025-11-29 07:06:52.724 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.727 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7e91bfd6-84e2-4668-a509-fb01a335ed2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.749 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ac70086-deaa-4ad1-bd4b-e182cf39ae9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.751 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[964b08bf-c838-4bef-ad3c-a7a19164fb70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.765 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7c4bd808-7554-4ec7-85ba-65bf01e7873c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543548, 'reachable_time': 16319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226956, 'error': None, 'target': 'ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.768 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9cf3a513-f54e-430e-b018-befaa643b464 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:06:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:06:52.768 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea5822f-e3cb-4f76-bd0c-0b6148cc5567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:06:52 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9cf3a513\x2df54e\x2d430e\x2db018\x2dbefaa643b464.mount: Deactivated successfully.
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.045 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.160 186962 DEBUG nova.compute.manager [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.161 186962 DEBUG oslo_concurrency.lockutils [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.161 186962 DEBUG oslo_concurrency.lockutils [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.161 186962 DEBUG oslo_concurrency.lockutils [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.161 186962 DEBUG nova.compute.manager [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] No waiting events found dispatching network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:06:54 np0005539505 nova_compute[186958]: 2025-11-29 07:06:54.161 186962 WARNING nova.compute.manager [req-de888bc5-7ba5-4e3b-8204-5449eadce012 req-d4a13f8f-a91e-4a10-b316-c75b24dd3f59 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received unexpected event network-vif-plugged-79d28453-30ed-42cf-a66e-786727a5b61c for instance with vm_state stopped and task_state None.#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.404 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:06:55 np0005539505 podman[226957]: 2025-11-29 07:06:55.722438008 +0000 UTC m=+0.053951526 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.735 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.735 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.736 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.736 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.736 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:55 np0005539505 podman[226958]: 2025-11-29 07:06:55.762291345 +0000 UTC m=+0.089839071 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.786 186962 INFO nova.compute.manager [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Terminating instance#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.797 186962 DEBUG nova.compute.manager [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.803 186962 INFO nova.virt.libvirt.driver [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Instance destroyed successfully.#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.804 186962 DEBUG nova.objects.instance [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lazy-loading 'resources' on Instance uuid d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.820 186962 DEBUG nova.virt.libvirt.vif [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:06:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-707522399',display_name='tempest-Íñstáñcé-1781493797',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-707522399',id=77,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:06:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='1dba9539037a4e9dbf33cba140fe21fe',ramdisk_id='',reservation_id='r-p5rx40om',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',own
er_project_name='tempest-ServersTestJSON-373958708',owner_user_name='tempest-ServersTestJSON-373958708-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:06:53Z,user_data=None,user_id='f2f86d3bd4814a09966b869dd539a6c9',uuid=d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.821 186962 DEBUG nova.network.os_vif_util [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converting VIF {"id": "79d28453-30ed-42cf-a66e-786727a5b61c", "address": "fa:16:3e:eb:2a:e1", "network": {"id": "9cf3a513-f54e-430e-b018-befaa643b464", "bridge": "br-int", "label": "tempest-ServersTestJSON-2082198627-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1dba9539037a4e9dbf33cba140fe21fe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap79d28453-30", "ovs_interfaceid": "79d28453-30ed-42cf-a66e-786727a5b61c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.822 186962 DEBUG nova.network.os_vif_util [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.822 186962 DEBUG os_vif [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.825 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap79d28453-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.828 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.831 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.835 186962 INFO os_vif [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:2a:e1,bridge_name='br-int',has_traffic_filtering=True,id=79d28453-30ed-42cf-a66e-786727a5b61c,network=Network(9cf3a513-f54e-430e-b018-befaa643b464),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap79d28453-30')#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.836 186962 INFO nova.virt.libvirt.driver [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Deleting instance files /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8_del#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.836 186962 INFO nova.virt.libvirt.driver [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Deletion of /var/lib/nova/instances/d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8_del complete#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.941 186962 INFO nova.compute.manager [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Took 0.14 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.941 186962 DEBUG oslo.service.loopingcall [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.943 186962 DEBUG nova.compute.manager [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:06:55 np0005539505 nova_compute[186958]: 2025-11-29 07:06:55.943 186962 DEBUG nova.network.neutron [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.403 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.431 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.431 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.431 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.432 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.604 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.604 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5719MB free_disk=73.22614669799805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.605 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.605 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.721 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.722 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.722 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.773 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.787 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.808 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:06:56 np0005539505 nova_compute[186958]: 2025-11-29 07:06:56.808 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.377 186962 DEBUG nova.network.neutron [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.399 186962 INFO nova.compute.manager [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Took 1.46 seconds to deallocate network for instance.#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.509 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.510 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.552 186962 DEBUG nova.compute.manager [req-5ba281e5-921d-4098-88b5-ef95bd09e3bf req-0fdb2afc-97c2-4ab4-93b1-2d10c93aac8a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Received event network-vif-deleted-79d28453-30ed-42cf-a66e-786727a5b61c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.580 186962 DEBUG nova.compute.provider_tree [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.595 186962 DEBUG nova.scheduler.client.report [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.629 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:57 np0005539505 nova_compute[186958]: 2025-11-29 07:06:57.821 186962 INFO nova.scheduler.client.report [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Deleted allocations for instance d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8#033[00m
Nov 29 02:06:58 np0005539505 nova_compute[186958]: 2025-11-29 07:06:58.157 186962 DEBUG oslo_concurrency.lockutils [None req-e4cc4e18-67a2-4423-8605-ec9ee181023f f2f86d3bd4814a09966b869dd539a6c9 1dba9539037a4e9dbf33cba140fe21fe - - default default] Lock "d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:06:58 np0005539505 nova_compute[186958]: 2025-11-29 07:06:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:06:59 np0005539505 nova_compute[186958]: 2025-11-29 07:06:59.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:00 np0005539505 podman[227008]: 2025-11-29 07:07:00.746923961 +0000 UTC m=+0.069890207 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 02:07:00 np0005539505 nova_compute[186958]: 2025-11-29 07:07:00.827 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:01 np0005539505 nova_compute[186958]: 2025-11-29 07:07:01.440 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:02 np0005539505 nova_compute[186958]: 2025-11-29 07:07:02.941 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:02 np0005539505 nova_compute[186958]: 2025-11-29 07:07:02.941 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:02 np0005539505 nova_compute[186958]: 2025-11-29 07:07:02.958 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.042 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.042 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.049 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.050 186962 INFO nova.compute.claims [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.224 186962 DEBUG nova.compute.provider_tree [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.239 186962 DEBUG nova.scheduler.client.report [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.270 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.271 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.347 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.348 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.367 186962 INFO nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.385 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.399 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.399 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.530 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.533 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.534 186962 INFO nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating image(s)#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.535 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.536 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.536 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.549 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.608 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.609 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.610 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.621 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.644 186962 DEBUG nova.policy [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1b8fbcc8caa4d94b69570f233c56d18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e6c366001df43fb91731faf7a9578fc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.685 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.685 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.730 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.731 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.732 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:03 np0005539505 podman[227034]: 2025-11-29 07:07:03.733288101 +0000 UTC m=+0.061290724 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.795 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.797 186962 DEBUG nova.virt.disk.api [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.797 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.862 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.864 186962 DEBUG nova.virt.disk.api [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.864 186962 DEBUG nova.objects.instance [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.879 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.880 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Ensure instance console log exists: /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.880 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.881 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:03 np0005539505 nova_compute[186958]: 2025-11-29 07:07:03.881 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:04 np0005539505 nova_compute[186958]: 2025-11-29 07:07:04.049 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:04 np0005539505 nova_compute[186958]: 2025-11-29 07:07:04.376 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Successfully created port: 13880ab3-f23b-4b30-b581-49b775aa5bfd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:07:04 np0005539505 nova_compute[186958]: 2025-11-29 07:07:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:04 np0005539505 nova_compute[186958]: 2025-11-29 07:07:04.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:07:05 np0005539505 nova_compute[186958]: 2025-11-29 07:07:05.829 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:05 np0005539505 nova_compute[186958]: 2025-11-29 07:07:05.946 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Successfully updated port: 13880ab3-f23b-4b30-b581-49b775aa5bfd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:07:05 np0005539505 nova_compute[186958]: 2025-11-29 07:07:05.970 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:05 np0005539505 nova_compute[186958]: 2025-11-29 07:07:05.971 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:05 np0005539505 nova_compute[186958]: 2025-11-29 07:07:05.971 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:07:06 np0005539505 nova_compute[186958]: 2025-11-29 07:07:06.366 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:07:06 np0005539505 nova_compute[186958]: 2025-11-29 07:07:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:06 np0005539505 nova_compute[186958]: 2025-11-29 07:07:06.600 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400011.5993357, d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:06 np0005539505 nova_compute[186958]: 2025-11-29 07:07:06.601 186962 INFO nova.compute.manager [-] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:07:06 np0005539505 nova_compute[186958]: 2025-11-29 07:07:06.736 186962 DEBUG nova.compute.manager [None req-9b9dbe85-4e18-4b18-a5fd-71f502039441 - - - - - -] [instance: d1ff1f5b-1bb6-4aed-bd56-a782dfbc50c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:07 np0005539505 nova_compute[186958]: 2025-11-29 07:07:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:07 np0005539505 nova_compute[186958]: 2025-11-29 07:07:07.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:07:07 np0005539505 nova_compute[186958]: 2025-11-29 07:07:07.496 186962 DEBUG nova.compute.manager [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-changed-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:07 np0005539505 nova_compute[186958]: 2025-11-29 07:07:07.497 186962 DEBUG nova.compute.manager [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Refreshing instance network info cache due to event network-changed-13880ab3-f23b-4b30-b581-49b775aa5bfd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:07:07 np0005539505 nova_compute[186958]: 2025-11-29 07:07:07.497 186962 DEBUG oslo_concurrency.lockutils [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.018 186962 DEBUG nova.network.neutron [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Updating instance_info_cache with network_info: [{"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.041 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.042 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance network_info: |[{"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.042 186962 DEBUG oslo_concurrency.lockutils [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.042 186962 DEBUG nova.network.neutron [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Refreshing network info cache for port 13880ab3-f23b-4b30-b581-49b775aa5bfd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.045 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start _get_guest_xml network_info=[{"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.050 186962 WARNING nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.054 186962 DEBUG nova.virt.libvirt.host [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.055 186962 DEBUG nova.virt.libvirt.host [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.058 186962 DEBUG nova.virt.libvirt.host [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.059 186962 DEBUG nova.virt.libvirt.host [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.060 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.060 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.061 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.061 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.061 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.061 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.061 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.062 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.062 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.062 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.062 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.062 186962 DEBUG nova.virt.hardware [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.066 186962 DEBUG nova.virt.libvirt.vif [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-tempest.common.compute-instance-1310590920',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:03Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.067 186962 DEBUG nova.network.os_vif_util [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.067 186962 DEBUG nova.network.os_vif_util [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.069 186962 DEBUG nova.objects.instance [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.088 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <uuid>42c368fd-3d19-43c7-a528-68d642f739a2</uuid>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <name>instance-00000050</name>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-1310590920</nova:name>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:07:08</nova:creationTime>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        <nova:port uuid="13880ab3-f23b-4b30-b581-49b775aa5bfd">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="serial">42c368fd-3d19-43c7-a528-68d642f739a2</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="uuid">42c368fd-3d19-43c7-a528-68d642f739a2</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:68:af:c6"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <target dev="tap13880ab3-f2"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/console.log" append="off"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:07:08 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:07:08 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:07:08 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:07:08 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.089 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Preparing to wait for external event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.090 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.090 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.090 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.091 186962 DEBUG nova.virt.libvirt.vif [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-tempest.common.compute-instance-1310590920',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:03Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.091 186962 DEBUG nova.network.os_vif_util [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.091 186962 DEBUG nova.network.os_vif_util [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.092 186962 DEBUG os_vif [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.092 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.093 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.093 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.097 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.098 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13880ab3-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.098 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13880ab3-f2, col_values=(('external_ids', {'iface-id': '13880ab3-f23b-4b30-b581-49b775aa5bfd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:af:c6', 'vm-uuid': '42c368fd-3d19-43c7-a528-68d642f739a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.100 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.1011] manager: (tap13880ab3-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.102 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.106 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.108 186962 INFO os_vif [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2')#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.172 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.173 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.173 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:68:af:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.174 186962 INFO nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Using config drive#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.467 186962 INFO nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating config drive at /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.472 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptdf84lyt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.571 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.598 186962 DEBUG oslo_concurrency.processutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptdf84lyt" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:08 np0005539505 kernel: tap13880ab3-f2: entered promiscuous mode
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.6633] manager: (tap13880ab3-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/152)
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.664 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:08Z|00301|binding|INFO|Claiming lport 13880ab3-f23b-4b30-b581-49b775aa5bfd for this chassis.
Nov 29 02:07:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:08Z|00302|binding|INFO|13880ab3-f23b-4b30-b581-49b775aa5bfd: Claiming fa:16:3e:68:af:c6 10.100.0.7
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.670 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.672 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.676 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.6831] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.682 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.6836] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Nov 29 02:07:08 np0005539505 systemd-udevd[227084]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.694 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:af:c6 10.100.0.7'], port_security=['fa:16:3e:68:af:c6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42c368fd-3d19-43c7-a528-68d642f739a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2afbfb6d-a7e3-43fe-9124-4ed2124bd3c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=13880ab3-f23b-4b30-b581-49b775aa5bfd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.696 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 13880ab3-f23b-4b30-b581-49b775aa5bfd in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.698 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.7053] device (tap13880ab3-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.7063] device (tap13880ab3-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:07:08 np0005539505 systemd-machined[153285]: New machine qemu-39-instance-00000050.
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.710 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[26d15101-4df2-4e3a-be1f-d7b9e8e91f1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.711 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.714 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.714 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2aacb14f-e413-42bc-af34-4d729214edb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.715 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c2dda2f7-92bd-4889-b654-bd0dc3297f98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.728 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[a783e3c8-3324-45f8-96d0-a9ca797c2859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 systemd[1]: Started Virtual Machine qemu-39-instance-00000050.
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.759 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f873e12a-e2bf-4fd3-b8e1-f3792fbda085]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.790 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5fba90-8bf1-4136-8fb0-511498816f91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.8103] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/155)
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.809 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86945469-0948-4516-922a-160b7f87e9cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:08Z|00303|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd ovn-installed in OVS
Nov 29 02:07:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:08Z|00304|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd up in Southbound
Nov 29 02:07:08 np0005539505 nova_compute[186958]: 2025-11-29 07:07:08.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.854 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[064ad79d-d998-4708-b498-a8770853e74c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.859 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[838c5715-99f8-4be1-856e-441371829b1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 NetworkManager[55134]: <info>  [1764400028.8890] device (tap9226dea3-60): carrier: link connected
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.895 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[09c8f0f7-6ec3-401b-a074-c4a9d2022c30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.915 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e9050d-04ac-4039-9cb9-4aa6acf0c84b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547647, 'reachable_time': 36191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227118, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.934 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4e4f2c-de45-43d9-8718-07ab03a568bb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547647, 'tstamp': 547647}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227119, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5672b8d2-00d6-4ab7-abdf-f1645b49d7af]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547647, 'reachable_time': 36191, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227120, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:08.985 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e55ba2fe-3ae1-4458-b8c5-b25d6974c7c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.051 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9b1238-4265-4c03-b02b-abbe12128898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.053 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.053 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.054 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:09 np0005539505 NetworkManager[55134]: <info>  [1764400029.0567] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/156)
Nov 29 02:07:09 np0005539505 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.060 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:09Z|00305|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.063 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.064 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[765fe1f6-9d15-4858-882d-33514e4b3201]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.065 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:07:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:09.067 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.074 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.111 186962 DEBUG nova.compute.manager [req-59dfe93a-473e-4fa2-8030-d37fada1fe4f req-07ee17aa-058f-48a9-9f5e-7c89832afde0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.112 186962 DEBUG oslo_concurrency.lockutils [req-59dfe93a-473e-4fa2-8030-d37fada1fe4f req-07ee17aa-058f-48a9-9f5e-7c89832afde0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.113 186962 DEBUG oslo_concurrency.lockutils [req-59dfe93a-473e-4fa2-8030-d37fada1fe4f req-07ee17aa-058f-48a9-9f5e-7c89832afde0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.113 186962 DEBUG oslo_concurrency.lockutils [req-59dfe93a-473e-4fa2-8030-d37fada1fe4f req-07ee17aa-058f-48a9-9f5e-7c89832afde0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.113 186962 DEBUG nova.compute.manager [req-59dfe93a-473e-4fa2-8030-d37fada1fe4f req-07ee17aa-058f-48a9-9f5e-7c89832afde0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Processing event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.264 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400029.2639084, 42c368fd-3d19-43c7-a528-68d642f739a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.265 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Started (Lifecycle Event)#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.274 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.278 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.282 186962 INFO nova.virt.libvirt.driver [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance spawned successfully.#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.283 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.300 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.308 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.312 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.313 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.313 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.314 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.314 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.315 186962 DEBUG nova.virt.libvirt.driver [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.348 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.348 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400029.264097, 42c368fd-3d19-43c7-a528-68d642f739a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.349 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.381 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.390 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400029.2776353, 42c368fd-3d19-43c7-a528-68d642f739a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.390 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.402 186962 INFO nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Took 5.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.403 186962 DEBUG nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.419 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.425 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.474 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:09 np0005539505 podman[227159]: 2025-11-29 07:07:09.501600402 +0000 UTC m=+0.056998902 container create 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.534 186962 INFO nova.compute.manager [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Took 6.53 seconds to build instance.#033[00m
Nov 29 02:07:09 np0005539505 systemd[1]: Started libpod-conmon-1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5.scope.
Nov 29 02:07:09 np0005539505 podman[227159]: 2025-11-29 07:07:09.46897536 +0000 UTC m=+0.024373870 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:07:09 np0005539505 nova_compute[186958]: 2025-11-29 07:07:09.569 186962 DEBUG oslo_concurrency.lockutils [None req-702f9337-cce9-40f7-b8bd-ec575008d79e e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:09 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:07:09 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e65601702a817c538bb12b6a0c1aa91a768b49dffd0011d0280dd4c0d1cdb2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:07:09 np0005539505 podman[227159]: 2025-11-29 07:07:09.611997834 +0000 UTC m=+0.167396364 container init 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:07:09 np0005539505 podman[227159]: 2025-11-29 07:07:09.617250402 +0000 UTC m=+0.172648902 container start 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:07:09 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [NOTICE]   (227177) : New worker (227179) forked
Nov 29 02:07:09 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [NOTICE]   (227177) : Loading success.
Nov 29 02:07:10 np0005539505 nova_compute[186958]: 2025-11-29 07:07:10.621 186962 DEBUG nova.network.neutron [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Updated VIF entry in instance network info cache for port 13880ab3-f23b-4b30-b581-49b775aa5bfd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:07:10 np0005539505 nova_compute[186958]: 2025-11-29 07:07:10.622 186962 DEBUG nova.network.neutron [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Updating instance_info_cache with network_info: [{"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:10 np0005539505 nova_compute[186958]: 2025-11-29 07:07:10.643 186962 DEBUG oslo_concurrency.lockutils [req-92911645-b4e8-45d2-84b6-acf2616f9821 req-908a29f0-8aa4-485f-b257-139a55363e30 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-42c368fd-3d19-43c7-a528-68d642f739a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.212 186962 DEBUG nova.compute.manager [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.212 186962 DEBUG oslo_concurrency.lockutils [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.213 186962 DEBUG oslo_concurrency.lockutils [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.213 186962 DEBUG oslo_concurrency.lockutils [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.213 186962 DEBUG nova.compute.manager [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:11 np0005539505 nova_compute[186958]: 2025-11-29 07:07:11.214 186962 WARNING nova.compute.manager [req-0576f4bb-3f86-4d5b-93d3-3019793bf5dc req-b108a3d6-e69b-48bb-9898-e6f5ba775c1e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state active and task_state None.#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.150 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.151 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.207 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.650 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.651 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.658 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.659 186962 INFO nova.compute.claims [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.901 186962 DEBUG nova.compute.provider_tree [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.935 186962 DEBUG nova.scheduler.client.report [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.996 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:12 np0005539505 nova_compute[186958]: 2025-11-29 07:07:12.997 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.067 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.068 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.091 186962 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.102 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.121 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.274 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.275 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.276 186962 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Creating image(s)#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.276 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.277 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.277 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.294 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.361 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.362 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.362 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.376 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.397 186962 DEBUG nova.policy [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3c9a3fa9f480479d98f522f6f02870fb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '477b89fb35da42f69c15b3f01054754a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.442 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.443 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.700 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk 1073741824" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.701 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.701 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:13 np0005539505 podman[227197]: 2025-11-29 07:07:13.733104166 +0000 UTC m=+0.058130395 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 29 02:07:13 np0005539505 podman[227198]: 2025-11-29 07:07:13.733778825 +0000 UTC m=+0.056392926 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.757 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.758 186962 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Checking if we can resize image /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.759 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.823 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.824 186962 DEBUG nova.virt.disk.api [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Cannot resize image /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:07:13 np0005539505 nova_compute[186958]: 2025-11-29 07:07:13.825 186962 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'migration_context' on Instance uuid 5690e5c7-982b-467a-bfd2-cf03ba81672c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:14 np0005539505 nova_compute[186958]: 2025-11-29 07:07:14.052 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:17 np0005539505 podman[227248]: 2025-11-29 07:07:17.734308828 +0000 UTC m=+0.061111399 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.115 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.667 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.667 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Ensure instance console log exists: /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.668 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.668 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.669 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:18 np0005539505 nova_compute[186958]: 2025-11-29 07:07:18.994 186962 INFO nova.compute.manager [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Rebuilding instance#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.344 186962 DEBUG nova.compute.manager [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.420 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_requests' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.532 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Successfully created port: 5be8ebc8-375b-4152-819e-76e386deb759 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.870 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.889 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.915 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.928 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:07:19 np0005539505 nova_compute[186958]: 2025-11-29 07:07:19.932 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.593 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Successfully updated port: 5be8ebc8-375b-4152-819e-76e386deb759 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.616 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.617 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquired lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.617 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.786 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.796 186962 DEBUG nova.compute.manager [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-changed-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.797 186962 DEBUG nova.compute.manager [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Refreshing instance network info cache due to event network-changed-5be8ebc8-375b-4152-819e-76e386deb759. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:07:20 np0005539505 nova_compute[186958]: 2025-11-29 07:07:20.797 186962 DEBUG oslo_concurrency.lockutils [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.747 186962 DEBUG nova.network.neutron [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Updating instance_info_cache with network_info: [{"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.772 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Releasing lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.773 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Instance network_info: |[{"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.774 186962 DEBUG oslo_concurrency.lockutils [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.774 186962 DEBUG nova.network.neutron [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Refreshing network info cache for port 5be8ebc8-375b-4152-819e-76e386deb759 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.780 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Start _get_guest_xml network_info=[{"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.789 186962 WARNING nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.794 186962 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.795 186962 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.805 186962 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.806 186962 DEBUG nova.virt.libvirt.host [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.807 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.807 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.807 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.808 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.808 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.808 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.808 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.808 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.809 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.809 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.809 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.809 186962 DEBUG nova.virt.hardware [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.812 186962 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-3',id=83,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:13Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=5690e5c7-982b-467a-bfd2-cf03ba81672c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.813 186962 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.814 186962 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:21 np0005539505 nova_compute[186958]: 2025-11-29 07:07:21.814 186962 DEBUG nova.objects.instance [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'pci_devices' on Instance uuid 5690e5c7-982b-467a-bfd2-cf03ba81672c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.019 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <uuid>5690e5c7-982b-467a-bfd2-cf03ba81672c</uuid>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <name>instance-00000053</name>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:name>tempest-ListServersNegativeTestJSON-server-1009215778-3</nova:name>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:07:21</nova:creationTime>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:user uuid="3c9a3fa9f480479d98f522f6f02870fb">tempest-ListServersNegativeTestJSON-316367608-project-member</nova:user>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:project uuid="477b89fb35da42f69c15b3f01054754a">tempest-ListServersNegativeTestJSON-316367608</nova:project>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        <nova:port uuid="5be8ebc8-375b-4152-819e-76e386deb759">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="serial">5690e5c7-982b-467a-bfd2-cf03ba81672c</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="uuid">5690e5c7-982b-467a-bfd2-cf03ba81672c</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.config"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:4f:cf:a3"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <target dev="tap5be8ebc8-37"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/console.log" append="off"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:07:22 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:07:22 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:07:22 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:07:22 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.020 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Preparing to wait for external event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.021 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.021 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.022 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.023 186962 DEBUG nova.virt.libvirt.vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-3',id=83,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:13Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=5690e5c7-982b-467a-bfd2-cf03ba81672c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.023 186962 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.024 186962 DEBUG nova.network.os_vif_util [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.025 186962 DEBUG os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.026 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.026 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.027 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.031 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5be8ebc8-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.032 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5be8ebc8-37, col_values=(('external_ids', {'iface-id': '5be8ebc8-375b-4152-819e-76e386deb759', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:cf:a3', 'vm-uuid': '5690e5c7-982b-467a-bfd2-cf03ba81672c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:22 np0005539505 NetworkManager[55134]: <info>  [1764400042.0351] manager: (tap5be8ebc8-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.046 186962 INFO os_vif [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37')#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.131 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.132 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.132 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] No VIF found with MAC fa:16:3e:4f:cf:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.132 186962 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Using config drive#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.732 186962 INFO nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Creating config drive at /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.config#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.737 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp1hs__pe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.869 186962 DEBUG oslo_concurrency.processutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpp1hs__pe" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:22 np0005539505 kernel: tap5be8ebc8-37: entered promiscuous mode
Nov 29 02:07:22 np0005539505 NetworkManager[55134]: <info>  [1764400042.9266] manager: (tap5be8ebc8-37): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Nov 29 02:07:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:22Z|00306|binding|INFO|Claiming lport 5be8ebc8-375b-4152-819e-76e386deb759 for this chassis.
Nov 29 02:07:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:22Z|00307|binding|INFO|5be8ebc8-375b-4152-819e-76e386deb759: Claiming fa:16:3e:4f:cf:a3 10.100.0.11
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.935 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:cf:a3 10.100.0.11'], port_security=['fa:16:3e:4f:cf:a3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5690e5c7-982b-467a-bfd2-cf03ba81672c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5be8ebc8-375b-4152-819e-76e386deb759) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.939 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5be8ebc8-375b-4152-819e-76e386deb759 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 bound to our chassis#033[00m
Nov 29 02:07:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:22Z|00308|binding|INFO|Setting lport 5be8ebc8-375b-4152-819e-76e386deb759 ovn-installed in OVS
Nov 29 02:07:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:22Z|00309|binding|INFO|Setting lport 5be8ebc8-375b-4152-819e-76e386deb759 up in Southbound
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.942 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.942 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9#033[00m
Nov 29 02:07:22 np0005539505 nova_compute[186958]: 2025-11-29 07:07:22.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:22 np0005539505 systemd-udevd[227306]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.956 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f54a2613-7398-4d51-8eec-cf3527f12fd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.957 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc8a3c675-41 in ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.960 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc8a3c675-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.960 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[de2e3d8e-bc01-4f3e-827c-520ef331daa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.961 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea24682-daae-40ea-adf4-ce9b2d417b19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:22 np0005539505 NetworkManager[55134]: <info>  [1764400042.9712] device (tap5be8ebc8-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:07:22 np0005539505 systemd-machined[153285]: New machine qemu-40-instance-00000053.
Nov 29 02:07:22 np0005539505 NetworkManager[55134]: <info>  [1764400042.9732] device (tap5be8ebc8-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.976 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3261e4f1-f8c5-40ac-a490-e30db2b2e2e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:22.991 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[663a922f-724d-4019-a93a-7b2959febdef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 systemd[1]: Started Virtual Machine qemu-40-instance-00000053.
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.022 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[da1a771a-4cbb-42e9-a6ab-793bd882f8df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.028 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ddabd3d6-ca47-4455-aa86-174bc1af67a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 NetworkManager[55134]: <info>  [1764400043.0301] manager: (tapc8a3c675-40): new Veth device (/org/freedesktop/NetworkManager/Devices/159)
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.068 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[197286f1-3759-4ebe-8a5c-ae83eae9791a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.072 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[074b1503-e060-49c3-8014-00baf4ed8f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 NetworkManager[55134]: <info>  [1764400043.0995] device (tapc8a3c675-40): carrier: link connected
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.106 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa7b2e8-a170-4c72-ae14-68c371a1a589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.122 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb5a681-1599-4950-8733-920fd26c332a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549068, 'reachable_time': 28972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227340, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.135 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[97ce36a3-628b-4943-af24-670f8618365e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:f663'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 549068, 'tstamp': 549068}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227341, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.152 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f3798a24-e3d6-4cee-9ec6-af3c9fb2e4ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc8a3c675-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:f6:63'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 102], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549068, 'reachable_time': 28972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227342, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.176 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2e245c-8201-4542-9ee5-0ecaa7710602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.221 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b91b48e-d216-4420-b7d7-e977c0a30376]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.222 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.223 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.223 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8a3c675-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.224 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:23 np0005539505 kernel: tapc8a3c675-40: entered promiscuous mode
Nov 29 02:07:23 np0005539505 NetworkManager[55134]: <info>  [1764400043.2255] manager: (tapc8a3c675-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.230 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc8a3c675-40, col_values=(('external_ids', {'iface-id': '2a5ced08-2785-4bf9-8fa1-c89240d15794'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.231 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:23 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:23Z|00310|binding|INFO|Releasing lport 2a5ced08-2785-4bf9-8fa1-c89240d15794 from this chassis (sb_readonly=0)
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.243 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.245 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c84dc6a-1083-4a4a-9605-3ce187dd53f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.246 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.pid.haproxy
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID c8a3c675-42f5-48a4-83d7-2d39dd3304b9
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:07:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:23.248 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'env', 'PROCESS_TAG=haproxy-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c8a3c675-42f5-48a4-83d7-2d39dd3304b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.660 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400043.6602597, 5690e5c7-982b-467a-bfd2-cf03ba81672c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.662 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] VM Started (Lifecycle Event)#033[00m
Nov 29 02:07:23 np0005539505 podman[227378]: 2025-11-29 07:07:23.571529497 +0000 UTC m=+0.020800789 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.728 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.734 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400043.6613755, 5690e5c7-982b-467a-bfd2-cf03ba81672c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.734 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.770 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.774 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:23 np0005539505 nova_compute[186958]: 2025-11-29 07:07:23.794 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:24Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:68:af:c6 10.100.0.7
Nov 29 02:07:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:24Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:68:af:c6 10.100.0.7
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.082 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.532 186962 DEBUG nova.compute.manager [req-f9fd0935-bf57-4fad-b7da-161f8cccced8 req-463ee08e-4bb0-4a57-b635-fbaec6da4ea6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.533 186962 DEBUG oslo_concurrency.lockutils [req-f9fd0935-bf57-4fad-b7da-161f8cccced8 req-463ee08e-4bb0-4a57-b635-fbaec6da4ea6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.533 186962 DEBUG oslo_concurrency.lockutils [req-f9fd0935-bf57-4fad-b7da-161f8cccced8 req-463ee08e-4bb0-4a57-b635-fbaec6da4ea6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.534 186962 DEBUG oslo_concurrency.lockutils [req-f9fd0935-bf57-4fad-b7da-161f8cccced8 req-463ee08e-4bb0-4a57-b635-fbaec6da4ea6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.534 186962 DEBUG nova.compute.manager [req-f9fd0935-bf57-4fad-b7da-161f8cccced8 req-463ee08e-4bb0-4a57-b635-fbaec6da4ea6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Processing event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.534 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.537 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400044.5377223, 5690e5c7-982b-467a-bfd2-cf03ba81672c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.538 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.540 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.543 186962 INFO nova.virt.libvirt.driver [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Instance spawned successfully.#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.543 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.559 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:24 np0005539505 podman[227378]: 2025-11-29 07:07:24.561173366 +0000 UTC m=+1.010444638 container create d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.565 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.572 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.572 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.573 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.573 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.574 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.574 186962 DEBUG nova.virt.libvirt.driver [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.586 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.654 186962 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Took 11.38 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:07:24 np0005539505 nova_compute[186958]: 2025-11-29 07:07:24.655 186962 DEBUG nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:25 np0005539505 systemd[1]: Started libpod-conmon-d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11.scope.
Nov 29 02:07:25 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:07:25 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6dd218dc11bc6657215d52b91b492a4b3ee497f7fd45c65581af02c2846561dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:07:25 np0005539505 nova_compute[186958]: 2025-11-29 07:07:25.540 186962 INFO nova.compute.manager [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Took 13.24 seconds to build instance.#033[00m
Nov 29 02:07:25 np0005539505 nova_compute[186958]: 2025-11-29 07:07:25.648 186962 DEBUG oslo_concurrency.lockutils [None req-6e2d4b0d-921a-4b2b-8027-277d05424bc4 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:25Z|00311|binding|INFO|Releasing lport 2a5ced08-2785-4bf9-8fa1-c89240d15794 from this chassis (sb_readonly=0)
Nov 29 02:07:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:25Z|00312|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:25 np0005539505 podman[227378]: 2025-11-29 07:07:25.694000893 +0000 UTC m=+2.143272205 container init d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:07:25 np0005539505 podman[227378]: 2025-11-29 07:07:25.719488474 +0000 UTC m=+2.168759746 container start d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:07:25 np0005539505 nova_compute[186958]: 2025-11-29 07:07:25.752 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:25 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [NOTICE]   (227397) : New worker (227399) forked
Nov 29 02:07:25 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [NOTICE]   (227397) : Loading success.
Nov 29 02:07:25 np0005539505 nova_compute[186958]: 2025-11-29 07:07:25.990 186962 DEBUG nova.network.neutron [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Updated VIF entry in instance network info cache for port 5be8ebc8-375b-4152-819e-76e386deb759. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:07:25 np0005539505 nova_compute[186958]: 2025-11-29 07:07:25.991 186962 DEBUG nova.network.neutron [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Updating instance_info_cache with network_info: [{"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.017 186962 DEBUG oslo_concurrency.lockutils [req-02a0250a-3c88-416b-9a42-9ec13c323977 req-64cf4eff-78d9-44fb-99c0-cdea969302e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5690e5c7-982b-467a-bfd2-cf03ba81672c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.703 186962 DEBUG nova.compute.manager [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.704 186962 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.704 186962 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.705 186962 DEBUG oslo_concurrency.lockutils [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.705 186962 DEBUG nova.compute.manager [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] No waiting events found dispatching network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:26 np0005539505 nova_compute[186958]: 2025-11-29 07:07:26.705 186962 WARNING nova.compute.manager [req-1d62a269-7bcb-46a9-a778-c061c18b1d07 req-44c0fa11-7bb8-4cf9-8d36-2846c67e53b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received unexpected event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:07:26 np0005539505 podman[227408]: 2025-11-29 07:07:26.720839543 +0000 UTC m=+0.054733828 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:07:26 np0005539505 podman[227409]: 2025-11-29 07:07:26.784276017 +0000 UTC m=+0.115552238 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:07:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:26.946 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:27 np0005539505 nova_compute[186958]: 2025-11-29 07:07:27.035 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:29 np0005539505 nova_compute[186958]: 2025-11-29 07:07:29.086 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:30 np0005539505 nova_compute[186958]: 2025-11-29 07:07:30.024 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:07:31 np0005539505 podman[227456]: 2025-11-29 07:07:31.727991566 +0000 UTC m=+0.065578115 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 29 02:07:32 np0005539505 nova_compute[186958]: 2025-11-29 07:07:32.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539505 kernel: tap13880ab3-f2 (unregistering): left promiscuous mode
Nov 29 02:07:33 np0005539505 NetworkManager[55134]: <info>  [1764400053.3147] device (tap13880ab3-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:07:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:33Z|00313|binding|INFO|Releasing lport 13880ab3-f23b-4b30-b581-49b775aa5bfd from this chassis (sb_readonly=0)
Nov 29 02:07:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:33Z|00314|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd down in Southbound
Nov 29 02:07:33 np0005539505 nova_compute[186958]: 2025-11-29 07:07:33.323 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:33Z|00315|binding|INFO|Removing iface tap13880ab3-f2 ovn-installed in OVS
Nov 29 02:07:33 np0005539505 nova_compute[186958]: 2025-11-29 07:07:33.327 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539505 nova_compute[186958]: 2025-11-29 07:07:33.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:33 np0005539505 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000050.scope: Deactivated successfully.
Nov 29 02:07:33 np0005539505 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000050.scope: Consumed 13.582s CPU time.
Nov 29 02:07:33 np0005539505 systemd-machined[153285]: Machine qemu-39-instance-00000050 terminated.
Nov 29 02:07:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:33.489 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:af:c6 10.100.0.7'], port_security=['fa:16:3e:68:af:c6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42c368fd-3d19-43c7-a528-68d642f739a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2afbfb6d-a7e3-43fe-9124-4ed2124bd3c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=13880ab3-f23b-4b30-b581-49b775aa5bfd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:33.491 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 13880ab3-f23b-4b30-b581-49b775aa5bfd in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:07:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:33.493 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:07:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:33.495 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e1f591-7578-4fb0-b171-88fcb6d9c24b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:33.496 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [NOTICE]   (227177) : haproxy version is 2.8.14-c23fe91
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [NOTICE]   (227177) : path to executable is /usr/sbin/haproxy
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [WARNING]  (227177) : Exiting Master process...
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [WARNING]  (227177) : Exiting Master process...
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [ALERT]    (227177) : Current worker (227179) exited with code 143 (Terminated)
Nov 29 02:07:33 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227173]: [WARNING]  (227177) : All workers exited. Exiting... (0)
Nov 29 02:07:33 np0005539505 systemd[1]: libpod-1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5.scope: Deactivated successfully.
Nov 29 02:07:33 np0005539505 podman[227514]: 2025-11-29 07:07:33.720328254 +0000 UTC m=+0.128127874 container died 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.046 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance shutdown successfully after 14 seconds.#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.053 186962 INFO nova.virt.libvirt.driver [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance destroyed successfully.#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.061 186962 INFO nova.virt.libvirt.driver [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance destroyed successfully.#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.062 186962 DEBUG nova.virt.libvirt.vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-ServerActionsTestJSON-server-187057732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:07:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:12Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.063 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.063 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.064 186962 DEBUG os_vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.068 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13880ab3-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.069 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.072 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.076 186962 INFO os_vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2')#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.076 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Deleting instance files /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2_del#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.077 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Deletion of /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2_del complete#033[00m
Nov 29 02:07:34 np0005539505 systemd[1]: var-lib-containers-storage-overlay-6e65601702a817c538bb12b6a0c1aa91a768b49dffd0011d0280dd4c0d1cdb2c-merged.mount: Deactivated successfully.
Nov 29 02:07:34 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5-userdata-shm.mount: Deactivated successfully.
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.088 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.131 186962 DEBUG nova.compute.manager [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.131 186962 DEBUG oslo_concurrency.lockutils [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.132 186962 DEBUG oslo_concurrency.lockutils [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.132 186962 DEBUG oslo_concurrency.lockutils [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.132 186962 DEBUG nova.compute.manager [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:34 np0005539505 nova_compute[186958]: 2025-11-29 07:07:34.133 186962 WARNING nova.compute.manager [req-86a2bd40-d4dd-4e0c-8a96-79fa853f0a80 req-b737e6d6-3e41-40f0-88ae-92e4d465ee6d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state active and task_state rebuilding.#033[00m
Nov 29 02:07:34 np0005539505 podman[227543]: 2025-11-29 07:07:34.170253173 +0000 UTC m=+0.235604962 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:07:34 np0005539505 podman[227514]: 2025-11-29 07:07:34.443706154 +0000 UTC m=+0.851505824 container cleanup 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:07:34 np0005539505 systemd[1]: libpod-conmon-1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5.scope: Deactivated successfully.
Nov 29 02:07:35 np0005539505 podman[227564]: 2025-11-29 07:07:35.510553227 +0000 UTC m=+1.042818814 container remove 1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.515 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[16c4e121-61d0-4508-abf6-a0bac9daa24b]: (4, ('Sat Nov 29 07:07:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5)\n1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5\nSat Nov 29 07:07:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5)\n1968bc1d60a5337021bd878ea7f908863cab4830f9b380489cd0ebac3782ecc5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4476c2d-3334-41df-aed2-11ad332eae3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.519 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:35 np0005539505 nova_compute[186958]: 2025-11-29 07:07:35.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:35 np0005539505 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:07:35 np0005539505 nova_compute[186958]: 2025-11-29 07:07:35.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[08a3d707-b396-428c-9dc9-e1a3fb564d41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.550 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee9e697-3a2d-4ff2-ab0b-747b4b408cd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.551 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe061a0-6072-491e-aa84-d55e8ac9611d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.568 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec93c8a9-5efa-44b5-948a-19f15662420f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547636, 'reachable_time': 29512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227579, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.571 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:07:35 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:35.572 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[de098f8d-ef56-4808-a991-9b26535035c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:35 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:07:36 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:36Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:cf:a3 10.100.0.11
Nov 29 02:07:36 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:36Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:cf:a3 10.100.0.11
Nov 29 02:07:39 np0005539505 nova_compute[186958]: 2025-11-29 07:07:39.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:39 np0005539505 nova_compute[186958]: 2025-11-29 07:07:39.091 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.489 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.490 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating image(s)#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.491 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.491 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.492 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.508 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.568 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.569 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.570 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.582 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.637 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.638 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.691 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.692 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.693 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.754 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.756 186962 DEBUG nova.virt.disk.api [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.756 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.808 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.809 186962 DEBUG nova.virt.disk.api [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.809 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.810 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Ensure instance console log exists: /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.810 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.811 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.811 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.813 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start _get_guest_xml network_info=[{"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.817 186962 WARNING nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.827 186962 DEBUG nova.virt.libvirt.host [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.828 186962 DEBUG nova.virt.libvirt.host [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.831 186962 DEBUG nova.virt.libvirt.host [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.831 186962 DEBUG nova.virt.libvirt.host [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.832 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.833 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.833 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.833 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.834 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.834 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.834 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.834 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.834 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.835 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.835 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.835 186962 DEBUG nova.virt.hardware [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:07:41 np0005539505 nova_compute[186958]: 2025-11-29 07:07:41.835 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.112 186962 DEBUG nova.compute.manager [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.113 186962 DEBUG oslo_concurrency.lockutils [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.113 186962 DEBUG oslo_concurrency.lockutils [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.114 186962 DEBUG oslo_concurrency.lockutils [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.114 186962 DEBUG nova.compute.manager [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:42 np0005539505 nova_compute[186958]: 2025-11-29 07:07:42.114 186962 WARNING nova.compute.manager [req-68a087d7-c054-4862-9a53-30e2f6c4ddcf req-ff88994c-9e6f-4d9b-9906-57f0c6beb0dd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:07:44 np0005539505 nova_compute[186958]: 2025-11-29 07:07:44.074 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:44 np0005539505 nova_compute[186958]: 2025-11-29 07:07:44.093 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:44 np0005539505 podman[227612]: 2025-11-29 07:07:44.717014121 +0000 UTC m=+0.047081552 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:07:44 np0005539505 podman[227611]: 2025-11-29 07:07:44.726079478 +0000 UTC m=+0.058040052 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.670 186962 DEBUG nova.virt.libvirt.vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-ServerActionsTestJSON-server-187057732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:07:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_na
me='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:40Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.670 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.671 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.672 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <uuid>42c368fd-3d19-43c7-a528-68d642f739a2</uuid>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <name>instance-00000050</name>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerActionsTestJSON-server-187057732</nova:name>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:07:41</nova:creationTime>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        <nova:port uuid="13880ab3-f23b-4b30-b581-49b775aa5bfd">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="serial">42c368fd-3d19-43c7-a528-68d642f739a2</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="uuid">42c368fd-3d19-43c7-a528-68d642f739a2</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:68:af:c6"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <target dev="tap13880ab3-f2"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/console.log" append="off"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:07:45 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:07:45 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:07:45 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:07:45 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.672 186962 DEBUG nova.compute.manager [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Preparing to wait for external event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.673 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.673 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.673 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.674 186962 DEBUG nova.virt.libvirt.vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-ServerActionsTestJSON-server-187057732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:07:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_na
me='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:07:40Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.674 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.674 186962 DEBUG nova.network.os_vif_util [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.674 186962 DEBUG os_vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.675 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.675 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.676 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.678 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13880ab3-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.679 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13880ab3-f2, col_values=(('external_ids', {'iface-id': '13880ab3-f23b-4b30-b581-49b775aa5bfd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:68:af:c6', 'vm-uuid': '42c368fd-3d19-43c7-a528-68d642f739a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:45.681 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:45 np0005539505 NetworkManager[55134]: <info>  [1764400065.6823] manager: (tap13880ab3-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:07:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:45.683 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.687 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.688 186962 INFO os_vif [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2')#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.752 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.753 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.753 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:68:af:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.753 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Using config drive#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.772 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'ec2_ids' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:45 np0005539505 nova_compute[186958]: 2025-11-29 07:07:45.803 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'keypairs' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.009 186962 INFO nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Creating config drive at /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.014 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0vjhvdb6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.139 186962 DEBUG oslo_concurrency.processutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0vjhvdb6" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:07:47 np0005539505 kernel: tap13880ab3-f2: entered promiscuous mode
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.1970] manager: (tap13880ab3-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Nov 29 02:07:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:47Z|00316|binding|INFO|Claiming lport 13880ab3-f23b-4b30-b581-49b775aa5bfd for this chassis.
Nov 29 02:07:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:47Z|00317|binding|INFO|13880ab3-f23b-4b30-b581-49b775aa5bfd: Claiming fa:16:3e:68:af:c6 10.100.0.7
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.208 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:af:c6 10.100.0.7'], port_security=['fa:16:3e:68:af:c6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42c368fd-3d19-43c7-a528-68d642f739a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2afbfb6d-a7e3-43fe-9124-4ed2124bd3c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=13880ab3-f23b-4b30-b581-49b775aa5bfd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.210 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 13880ab3-f23b-4b30-b581-49b775aa5bfd in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.212 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:07:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:47Z|00318|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd ovn-installed in OVS
Nov 29 02:07:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:47Z|00319|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd up in Southbound
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.218 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.223 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f69b06d7-ae8f-4051-9884-5a9e04bb6e80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.224 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.226 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.226 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1efd1cfc-6fac-4216-bdce-6f21ced08247]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.228 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99191260-467f-4c63-921c-e2a8094c9b66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 systemd-udevd[227676]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:47 np0005539505 systemd-machined[153285]: New machine qemu-41-instance-00000050.
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.242 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[13695879-8729-4c8a-82d7-910adca25474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.2471] device (tap13880ab3-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.2479] device (tap13880ab3-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:07:47 np0005539505 systemd[1]: Started Virtual Machine qemu-41-instance-00000050.
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.258 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[386637e2-11fd-48d4-9a76-e8875859fe50]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.287 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8241f7-c9bd-4de3-a51d-edfad635cd8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 systemd-udevd[227680]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.293 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd9897c0-9e67-42e6-a8f5-54f65d53014e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.2950] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/163)
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.324 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[cdba9060-4c01-4beb-ac59-f5072a126fcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.327 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fca131f0-7109-406c-8e71-bcc342df5852]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.3501] device (tap9226dea3-60): carrier: link connected
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.357 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0476bd16-ebc7-4770-996b-3e7414873574]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.375 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[15c5dea3-308d-44e8-aa36-a7c2690b361a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551493, 'reachable_time': 40246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227709, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.391 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[caba0146-5bf6-41f6-b8b8-ea1a9e0e9a20]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 551493, 'tstamp': 551493}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227710, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.409 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b549f0e7-4f66-4cf0-a679-ef3557729ef0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551493, 'reachable_time': 40246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227711, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.437 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2481a12a-5166-4fc9-8571-16bef7a59951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.492 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[63de2508-3e41-4b01-8f05-e0f21e64e182]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.494 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.494 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.494 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.496 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 NetworkManager[55134]: <info>  [1764400067.4970] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Nov 29 02:07:47 np0005539505 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.500 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:47Z|00320|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.501 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.515 186962 DEBUG nova.compute.manager [req-5c5754ea-1dad-4382-a530-c4609da3a6cc req-6a5f8665-bfa2-4d90-b4a2-7dd44f5e1df7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.515 186962 DEBUG oslo_concurrency.lockutils [req-5c5754ea-1dad-4382-a530-c4609da3a6cc req-6a5f8665-bfa2-4d90-b4a2-7dd44f5e1df7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.515 186962 DEBUG oslo_concurrency.lockutils [req-5c5754ea-1dad-4382-a530-c4609da3a6cc req-6a5f8665-bfa2-4d90-b4a2-7dd44f5e1df7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.516 186962 DEBUG oslo_concurrency.lockutils [req-5c5754ea-1dad-4382-a530-c4609da3a6cc req-6a5f8665-bfa2-4d90-b4a2-7dd44f5e1df7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.516 186962 DEBUG nova.compute.manager [req-5c5754ea-1dad-4382-a530-c4609da3a6cc req-6a5f8665-bfa2-4d90-b4a2-7dd44f5e1df7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Processing event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.524 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.525 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[911602cd-51c6-48c1-ae8a-94dda8aee878]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.526 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:07:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:47.526 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.593 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 42c368fd-3d19-43c7-a528-68d642f739a2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.594 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400067.5931299, 42c368fd-3d19-43c7-a528-68d642f739a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.594 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Started (Lifecycle Event)#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.596 186962 DEBUG nova.compute.manager [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.600 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.603 186962 INFO nova.virt.libvirt.driver [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance spawned successfully.#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.603 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.633 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.638 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.641 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.642 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.642 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.642 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.643 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.643 186962 DEBUG nova.virt.libvirt.driver [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.687 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.688 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400067.5934713, 42c368fd-3d19-43c7-a528-68d642f739a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.688 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.726 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.729 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400067.5993605, 42c368fd-3d19-43c7-a528-68d642f739a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.730 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.756 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.758 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.777 186962 DEBUG nova.compute.manager [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.783 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.879 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.879 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.880 186962 DEBUG nova.objects.instance [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:07:47 np0005539505 podman[227750]: 2025-11-29 07:07:47.873538714 +0000 UTC m=+0.021456568 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:07:47 np0005539505 nova_compute[186958]: 2025-11-29 07:07:47.984 186962 DEBUG oslo_concurrency.lockutils [None req-d963d68f-47db-43e3-b732-35780b8d231c e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:48 np0005539505 podman[227750]: 2025-11-29 07:07:48.537423722 +0000 UTC m=+0.685341556 container create 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:07:48 np0005539505 systemd[1]: Started libpod-conmon-7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311.scope.
Nov 29 02:07:48 np0005539505 podman[227763]: 2025-11-29 07:07:48.751876585 +0000 UTC m=+0.169610326 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:07:48 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:07:48 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe3c3aeb262d817ff2d03bf1a4746855891f06f9a2e1bc2c45c9856beff71fcf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:07:48 np0005539505 podman[227750]: 2025-11-29 07:07:48.876877999 +0000 UTC m=+1.024795853 container init 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:07:48 np0005539505 podman[227750]: 2025-11-29 07:07:48.884301929 +0000 UTC m=+1.032219763 container start 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:07:48 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [NOTICE]   (227791) : New worker (227793) forked
Nov 29 02:07:48 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [NOTICE]   (227791) : Loading success.
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.634 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.634 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.634 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.635 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.635 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.651 186962 INFO nova.compute.manager [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Terminating instance#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.692 186962 DEBUG nova.compute.manager [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:07:49 np0005539505 kernel: tap13880ab3-f2 (unregistering): left promiscuous mode
Nov 29 02:07:49 np0005539505 NetworkManager[55134]: <info>  [1764400069.7155] device (tap13880ab3-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.729 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:49Z|00321|binding|INFO|Releasing lport 13880ab3-f23b-4b30-b581-49b775aa5bfd from this chassis (sb_readonly=0)
Nov 29 02:07:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:49Z|00322|binding|INFO|Setting lport 13880ab3-f23b-4b30-b581-49b775aa5bfd down in Southbound
Nov 29 02:07:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:49Z|00323|binding|INFO|Removing iface tap13880ab3-f2 ovn-installed in OVS
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.731 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.741 186962 DEBUG nova.compute.manager [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.741 186962 DEBUG oslo_concurrency.lockutils [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.741 186962 DEBUG oslo_concurrency.lockutils [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.742 186962 DEBUG oslo_concurrency.lockutils [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.742 186962 DEBUG nova.compute.manager [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.742 186962 WARNING nova.compute.manager [req-e11073b0-8867-4d24-b27d-a57b3869f056 req-fa136671-39d1-41c7-82c3-a1b48e887afd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:07:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:49.743 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:af:c6 10.100.0.7'], port_security=['fa:16:3e:68:af:c6 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '42c368fd-3d19-43c7-a528-68d642f739a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2afbfb6d-a7e3-43fe-9124-4ed2124bd3c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=13880ab3-f23b-4b30-b581-49b775aa5bfd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:49.744 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 13880ab3-f23b-4b30-b581-49b775aa5bfd in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:07:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:49.745 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:07:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:49.747 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7200bbaa-58e2-430f-b453-6fe668ceb6de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:49.747 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.749 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000050.scope: Deactivated successfully.
Nov 29 02:07:49 np0005539505 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000050.scope: Consumed 2.457s CPU time.
Nov 29 02:07:49 np0005539505 systemd-machined[153285]: Machine qemu-41-instance-00000050 terminated.
Nov 29 02:07:49 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [NOTICE]   (227791) : haproxy version is 2.8.14-c23fe91
Nov 29 02:07:49 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [NOTICE]   (227791) : path to executable is /usr/sbin/haproxy
Nov 29 02:07:49 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [WARNING]  (227791) : Exiting Master process...
Nov 29 02:07:49 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [ALERT]    (227791) : Current worker (227793) exited with code 143 (Terminated)
Nov 29 02:07:49 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[227784]: [WARNING]  (227791) : All workers exited. Exiting... (0)
Nov 29 02:07:49 np0005539505 systemd[1]: libpod-7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311.scope: Deactivated successfully.
Nov 29 02:07:49 np0005539505 podman[227822]: 2025-11-29 07:07:49.95073273 +0000 UTC m=+0.114091877 container died 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.967 186962 INFO nova.virt.libvirt.driver [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Instance destroyed successfully.#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.967 186962 DEBUG nova.objects.instance [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 42c368fd-3d19-43c7-a528-68d642f739a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.984 186962 DEBUG nova.virt.libvirt.vif [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:07:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1310590920',display_name='tempest-ServerActionsTestJSON-server-187057732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1310590920',id=80,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:07:47Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-efvjtvso',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vir
tio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:07:47Z,user_data=None,user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=42c368fd-3d19-43c7-a528-68d642f739a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.985 186962 DEBUG nova.network.os_vif_util [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "address": "fa:16:3e:68:af:c6", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13880ab3-f2", "ovs_interfaceid": "13880ab3-f23b-4b30-b581-49b775aa5bfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.985 186962 DEBUG nova.network.os_vif_util [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.986 186962 DEBUG os_vif [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.989 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.989 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13880ab3-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.991 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.995 186962 INFO os_vif [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:af:c6,bridge_name='br-int',has_traffic_filtering=True,id=13880ab3-f23b-4b30-b581-49b775aa5bfd,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13880ab3-f2')#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.996 186962 INFO nova.virt.libvirt.driver [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Deleting instance files /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2_del#033[00m
Nov 29 02:07:49 np0005539505 nova_compute[186958]: 2025-11-29 07:07:49.996 186962 INFO nova.virt.libvirt.driver [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Deletion of /var/lib/nova/instances/42c368fd-3d19-43c7-a528-68d642f739a2_del complete#033[00m
Nov 29 02:07:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311-userdata-shm.mount: Deactivated successfully.
Nov 29 02:07:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay-fe3c3aeb262d817ff2d03bf1a4746855891f06f9a2e1bc2c45c9856beff71fcf-merged.mount: Deactivated successfully.
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.077 186962 INFO nova.compute.manager [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.079 186962 DEBUG oslo.service.loopingcall [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.079 186962 DEBUG nova.compute.manager [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.079 186962 DEBUG nova.network.neutron [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:07:50 np0005539505 podman[227822]: 2025-11-29 07:07:50.084050119 +0000 UTC m=+0.247409266 container cleanup 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:07:50 np0005539505 systemd[1]: libpod-conmon-7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311.scope: Deactivated successfully.
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.215 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.216 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.216 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.216 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.216 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.227 186962 INFO nova.compute.manager [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Terminating instance#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.236 186962 DEBUG nova.compute.manager [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:07:50 np0005539505 kernel: tap5be8ebc8-37 (unregistering): left promiscuous mode
Nov 29 02:07:50 np0005539505 NetworkManager[55134]: <info>  [1764400070.2667] device (tap5be8ebc8-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:07:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:50Z|00324|binding|INFO|Releasing lport 5be8ebc8-375b-4152-819e-76e386deb759 from this chassis (sb_readonly=0)
Nov 29 02:07:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:50Z|00325|binding|INFO|Setting lport 5be8ebc8-375b-4152-819e-76e386deb759 down in Southbound
Nov 29 02:07:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:07:50Z|00326|binding|INFO|Removing iface tap5be8ebc8-37 ovn-installed in OVS
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.277 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.286 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:cf:a3 10.100.0.11'], port_security=['fa:16:3e:4f:cf:a3 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5690e5c7-982b-467a-bfd2-cf03ba81672c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '477b89fb35da42f69c15b3f01054754a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d87673c-3e8a-46ed-9956-50ea661306ab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114ba21f-c978-4c05-97f9-429aa66017b7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5be8ebc8-375b-4152-819e-76e386deb759) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.296 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Deactivated successfully.
Nov 29 02:07:50 np0005539505 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000053.scope: Consumed 13.854s CPU time.
Nov 29 02:07:50 np0005539505 systemd-machined[153285]: Machine qemu-40-instance-00000053 terminated.
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.502 186962 INFO nova.virt.libvirt.driver [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Instance destroyed successfully.#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.502 186962 DEBUG nova.objects.instance [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lazy-loading 'resources' on Instance uuid 5690e5c7-982b-467a-bfd2-cf03ba81672c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.522 186962 DEBUG nova.virt.libvirt.vif [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:07:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-1009215778',display_name='tempest-ListServersNegativeTestJSON-server-1009215778-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-1009215778-3',id=83,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-11-29T07:07:24Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='477b89fb35da42f69c15b3f01054754a',ramdisk_id='',reservation_id='r-fs8srmd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-316367608',owner_user_name='tempest-ListServersNegativeTestJSON-316367608-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:07:24Z,user_data=None,user_id='3c9a3fa9f480479d98f522f6f02870fb',uuid=5690e5c7-982b-467a-bfd2-cf03ba81672c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.523 186962 DEBUG nova.network.os_vif_util [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converting VIF {"id": "5be8ebc8-375b-4152-819e-76e386deb759", "address": "fa:16:3e:4f:cf:a3", "network": {"id": "c8a3c675-42f5-48a4-83d7-2d39dd3304b9", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1711984379-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "477b89fb35da42f69c15b3f01054754a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5be8ebc8-37", "ovs_interfaceid": "5be8ebc8-375b-4152-819e-76e386deb759", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.523 186962 DEBUG nova.network.os_vif_util [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.524 186962 DEBUG os_vif [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.525 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.525 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5be8ebc8-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.530 186962 INFO os_vif [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:cf:a3,bridge_name='br-int',has_traffic_filtering=True,id=5be8ebc8-375b-4152-819e-76e386deb759,network=Network(c8a3c675-42f5-48a4-83d7-2d39dd3304b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5be8ebc8-37')#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.531 186962 INFO nova.virt.libvirt.driver [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Deleting instance files /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c_del#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.531 186962 INFO nova.virt.libvirt.driver [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Deletion of /var/lib/nova/instances/5690e5c7-982b-467a-bfd2-cf03ba81672c_del complete#033[00m
Nov 29 02:07:50 np0005539505 podman[227866]: 2025-11-29 07:07:50.601137058 +0000 UTC m=+0.494302436 container remove 7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.607 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7e9a46-ead9-432f-be8b-cbf9ff71cf14]: (4, ('Sat Nov 29 07:07:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311)\n7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311\nSat Nov 29 07:07:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311)\n7cbf0f4a82c81739bd2bc6ad92b75e2e40cd89feeb7c4cc66acbfa1ba3f6d311\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.609 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb7a82b-c05f-4c49-8665-d1442aca3173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.610 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.629 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.630 186962 INFO nova.compute.manager [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.631 186962 DEBUG oslo.service.loopingcall [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.631 186962 DEBUG nova.compute.manager [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.631 186962 DEBUG nova.network.neutron [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.630 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd9f850-68b9-4f6d-919c-48b097526189]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.643 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9f02ca-b37e-4c5d-ab00-cfbd8a0abf50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.645 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2644fd-942d-41b8-9b78-d96a52c9629b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.661 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9a2a18-429b-4f48-965b-79f0a234ba56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 551486, 'reachable_time': 23709, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227906, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.663 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.663 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c7af32c6-81e6-4047-9d47-1f076e9fad5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.664 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5be8ebc8-375b-4152-819e-76e386deb759 in datapath c8a3c675-42f5-48a4-83d7-2d39dd3304b9 unbound from our chassis#033[00m
Nov 29 02:07:50 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.665 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8a3c675-42f5-48a4-83d7-2d39dd3304b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.666 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[438fa149-e8da-4f7a-aca8-a021ea5b2ab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:50.667 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 namespace which is not needed anymore#033[00m
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [NOTICE]   (227397) : haproxy version is 2.8.14-c23fe91
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [NOTICE]   (227397) : path to executable is /usr/sbin/haproxy
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [WARNING]  (227397) : Exiting Master process...
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [WARNING]  (227397) : Exiting Master process...
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [ALERT]    (227397) : Current worker (227399) exited with code 143 (Terminated)
Nov 29 02:07:50 np0005539505 neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9[227393]: [WARNING]  (227397) : All workers exited. Exiting... (0)
Nov 29 02:07:50 np0005539505 systemd[1]: libpod-d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11.scope: Deactivated successfully.
Nov 29 02:07:50 np0005539505 podman[227922]: 2025-11-29 07:07:50.813478391 +0000 UTC m=+0.051771994 container died d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:07:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11-userdata-shm.mount: Deactivated successfully.
Nov 29 02:07:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay-6dd218dc11bc6657215d52b91b492a4b3ee497f7fd45c65581af02c2846561dc-merged.mount: Deactivated successfully.
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.871 186962 DEBUG nova.network.neutron [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:50 np0005539505 podman[227922]: 2025-11-29 07:07:50.877738328 +0000 UTC m=+0.116031931 container cleanup d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:07:50 np0005539505 systemd[1]: libpod-conmon-d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11.scope: Deactivated successfully.
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.913 186962 INFO nova.compute.manager [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Took 0.83 seconds to deallocate network for instance.#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.983 186962 DEBUG nova.compute.manager [req-f7fd8676-4569-449a-8ced-f1744f34e522 req-6e1cc66d-dce0-41fe-b99d-6866b88398ff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-deleted-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.997 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:50 np0005539505 nova_compute[186958]: 2025-11-29 07:07:50.998 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:51 np0005539505 podman[227951]: 2025-11-29 07:07:51.180675563 +0000 UTC m=+0.280315356 container remove d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.187 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e2af24-9afd-4d3c-a401-4ce1da9fda75]: (4, ('Sat Nov 29 07:07:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11)\nd9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11\nSat Nov 29 07:07:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 (d9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11)\nd9364391f8a899f7d2af1329aad3b28d5c59d8c74160769f175e97e70288da11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.189 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bd8ab23d-a1fe-4e29-82c5-4ea896651e32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.190 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a3c675-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:51 np0005539505 kernel: tapc8a3c675-40: left promiscuous mode
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.198 186962 DEBUG nova.compute.provider_tree [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.205 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.208 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0614b4f1-fa56-4159-bec4-44e78a35b329]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.216 186962 DEBUG nova.scheduler.client.report [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.226 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e98958-19d8-42b0-bcf4-38eae0a2a379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.228 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[59ab33f3-329d-45a8-a4fc-6b724c818475]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.244 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[17536179-dcf2-4860-977e-1507dfbefa53]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 549060, 'reachable_time': 35266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227966, 'error': None, 'target': 'ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.246 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c8a3c675-42f5-48a4-83d7-2d39dd3304b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:07:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:51.247 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f06e17-dc6d-4119-9fb6-35b0d3b5e92b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:07:51 np0005539505 systemd[1]: run-netns-ovnmeta\x2dc8a3c675\x2d42f5\x2d48a4\x2d83d7\x2d2d39dd3304b9.mount: Deactivated successfully.
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.247 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.280 186962 INFO nova.scheduler.client.report [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Deleted allocations for instance 42c368fd-3d19-43c7-a528-68d642f739a2#033[00m
Nov 29 02:07:51 np0005539505 nova_compute[186958]: 2025-11-29 07:07:51.473 186962 DEBUG oslo_concurrency.lockutils [None req-a785b689-d7b6-4825-a3d1-ffff5d5a50a0 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.052 186962 DEBUG nova.network.neutron [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.061 186962 DEBUG nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.062 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.062 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.062 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.062 186962 DEBUG nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.062 186962 WARNING nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-unplugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.063 186962 DEBUG nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.063 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.063 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.063 186962 DEBUG oslo_concurrency.lockutils [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "42c368fd-3d19-43c7-a528-68d642f739a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.063 186962 DEBUG nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] No waiting events found dispatching network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.064 186962 WARNING nova.compute.manager [req-61f726c7-ce15-460c-a6df-17d5d3a42076 req-45a71cc5-ab94-4058-9d06-3be7647bed9b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Received unexpected event network-vif-plugged-13880ab3-f23b-4b30-b581-49b775aa5bfd for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.095 186962 INFO nova.compute.manager [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Took 1.46 seconds to deallocate network for instance.#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.221 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.221 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.270 186962 DEBUG nova.compute.provider_tree [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.285 186962 DEBUG nova.scheduler.client.report [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.309 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.333 186962 INFO nova.scheduler.client.report [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Deleted allocations for instance 5690e5c7-982b-467a-bfd2-cf03ba81672c#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.419 186962 DEBUG oslo_concurrency.lockutils [None req-d18e6698-d844-4408-b6f9-ced66d9d67a3 3c9a3fa9f480479d98f522f6f02870fb 477b89fb35da42f69c15b3f01054754a - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.447 186962 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-vif-unplugged-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.447 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.448 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.448 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.448 186962 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] No waiting events found dispatching network-vif-unplugged-5be8ebc8-375b-4152-819e-76e386deb759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.448 186962 WARNING nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received unexpected event network-vif-unplugged-5be8ebc8-375b-4152-819e-76e386deb759 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.448 186962 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.449 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.449 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.449 186962 DEBUG oslo_concurrency.lockutils [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5690e5c7-982b-467a-bfd2-cf03ba81672c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.449 186962 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] No waiting events found dispatching network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.450 186962 WARNING nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received unexpected event network-vif-plugged-5be8ebc8-375b-4152-819e-76e386deb759 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:07:52 np0005539505 nova_compute[186958]: 2025-11-29 07:07:52.450 186962 DEBUG nova.compute.manager [req-9bb8030f-ecf1-4b82-9a2a-17f55e4b1a56 req-1c6f8b75-5cc1-4180-a2b3-b43fe0c67de8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Received event network-vif-deleted-5be8ebc8-375b-4152-819e-76e386deb759 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:07:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:07:53.686 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:07:54 np0005539505 nova_compute[186958]: 2025-11-29 07:07:54.179 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:54 np0005539505 nova_compute[186958]: 2025-11-29 07:07:54.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:55 np0005539505 nova_compute[186958]: 2025-11-29 07:07:55.529 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:56 np0005539505 nova_compute[186958]: 2025-11-29 07:07:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.170 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.226 186962 DEBUG nova.compute.manager [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.460 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.461 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.541 186962 DEBUG nova.objects.instance [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_requests' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.563 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.564 186962 INFO nova.compute.claims [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.565 186962 DEBUG nova.objects.instance [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.583 186962 DEBUG nova.objects.instance [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.666 186962 INFO nova.compute.resource_tracker [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating resource usage from migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.667 186962 DEBUG nova.compute.resource_tracker [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Starting to track incoming migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:07:57 np0005539505 podman[227968]: 2025-11-29 07:07:57.726328231 +0000 UTC m=+0.062580280 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.797 186962 DEBUG nova.compute.provider_tree [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:57 np0005539505 podman[227969]: 2025-11-29 07:07:57.805710115 +0000 UTC m=+0.131905010 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.813 186962 DEBUG nova.scheduler.client.report [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.839 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:57 np0005539505 nova_compute[186958]: 2025-11-29 07:07:57.840 186962 INFO nova.compute.manager [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Migrating#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.543 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.544 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.544 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.545 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.710 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.711 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5684MB free_disk=73.22595977783203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.711 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.711 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.837 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Migration for instance 9223f44a-297e-4db1-9f44-ee0694c4e258 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.929 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating resource usage from migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b#033[00m
Nov 29 02:07:58 np0005539505 nova_compute[186958]: 2025-11-29 07:07:58.930 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Starting to track incoming migration 2f05530a-5042-477c-94a7-4c27a7f0ae7b with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.156 186962 WARNING nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 9223f44a-297e-4db1-9f44-ee0694c4e258 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.156 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.157 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.180 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.310 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.336 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.378 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:07:59 np0005539505 nova_compute[186958]: 2025-11-29 07:07:59.378 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:00 np0005539505 nova_compute[186958]: 2025-11-29 07:08:00.531 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:02 np0005539505 podman[228018]: 2025-11-29 07:08:02.722994935 +0000 UTC m=+0.051292321 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Nov 29 02:08:03 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:08:03 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:08:03 np0005539505 systemd-logind[794]: New session 40 of user nova.
Nov 29 02:08:03 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:08:03 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:08:03 np0005539505 systemd[228045]: Queued start job for default target Main User Target.
Nov 29 02:08:03 np0005539505 systemd[228045]: Created slice User Application Slice.
Nov 29 02:08:03 np0005539505 systemd[228045]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:08:03 np0005539505 systemd[228045]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:08:03 np0005539505 systemd[228045]: Reached target Paths.
Nov 29 02:08:03 np0005539505 systemd[228045]: Reached target Timers.
Nov 29 02:08:03 np0005539505 systemd[228045]: Starting D-Bus User Message Bus Socket...
Nov 29 02:08:03 np0005539505 systemd[228045]: Starting Create User's Volatile Files and Directories...
Nov 29 02:08:03 np0005539505 systemd[228045]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:08:03 np0005539505 systemd[228045]: Reached target Sockets.
Nov 29 02:08:03 np0005539505 systemd[228045]: Finished Create User's Volatile Files and Directories.
Nov 29 02:08:03 np0005539505 systemd[228045]: Reached target Basic System.
Nov 29 02:08:03 np0005539505 systemd[228045]: Reached target Main User Target.
Nov 29 02:08:03 np0005539505 systemd[228045]: Startup finished in 151ms.
Nov 29 02:08:03 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:08:03 np0005539505 systemd[1]: Started Session 40 of User nova.
Nov 29 02:08:03 np0005539505 systemd[1]: session-40.scope: Deactivated successfully.
Nov 29 02:08:03 np0005539505 systemd-logind[794]: Session 40 logged out. Waiting for processes to exit.
Nov 29 02:08:03 np0005539505 systemd-logind[794]: Removed session 40.
Nov 29 02:08:03 np0005539505 systemd-logind[794]: New session 42 of user nova.
Nov 29 02:08:03 np0005539505 systemd[1]: Started Session 42 of User nova.
Nov 29 02:08:03 np0005539505 systemd-logind[794]: Session 42 logged out. Waiting for processes to exit.
Nov 29 02:08:03 np0005539505 systemd[1]: session-42.scope: Deactivated successfully.
Nov 29 02:08:03 np0005539505 systemd-logind[794]: Removed session 42.
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.183 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.374 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.374 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.439 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:08:04 np0005539505 podman[228067]: 2025-11-29 07:08:04.75801411 +0000 UTC m=+0.085514908 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.965 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400069.9642835, 42c368fd-3d19-43c7-a528-68d642f739a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:04 np0005539505 nova_compute[186958]: 2025-11-29 07:08:04.966 186962 INFO nova.compute.manager [-] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:08:05 np0005539505 nova_compute[186958]: 2025-11-29 07:08:05.025 186962 DEBUG nova.compute.manager [None req-66bc2e63-fb1c-4a94-93cf-4bcfde3842c3 - - - - - -] [instance: 42c368fd-3d19-43c7-a528-68d642f739a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:05 np0005539505 nova_compute[186958]: 2025-11-29 07:08:05.501 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400070.4997873, 5690e5c7-982b-467a-bfd2-cf03ba81672c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:05 np0005539505 nova_compute[186958]: 2025-11-29 07:08:05.501 186962 INFO nova.compute.manager [-] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:08:05 np0005539505 nova_compute[186958]: 2025-11-29 07:08:05.535 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:05 np0005539505 nova_compute[186958]: 2025-11-29 07:08:05.550 186962 DEBUG nova.compute.manager [None req-ad18ddea-dfc3-4058-99cf-2326a1fc2b4a - - - - - -] [instance: 5690e5c7-982b-467a-bfd2-cf03ba81672c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.830 186962 DEBUG nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.831 186962 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.831 186962 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.831 186962 DEBUG oslo_concurrency.lockutils [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.832 186962 DEBUG nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:06 np0005539505 nova_compute[186958]: 2025-11-29 07:08:06.832 186962 WARNING nova.compute.manager [req-a25f9ec3-bf54-4540-be02-3cbb075a0af2 req-5af6262d-2807-4bd9-b78c-eedd82109efb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:08:07 np0005539505 systemd-logind[794]: New session 43 of user nova.
Nov 29 02:08:07 np0005539505 systemd[1]: Started Session 43 of User nova.
Nov 29 02:08:07 np0005539505 nova_compute[186958]: 2025-11-29 07:08:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:07 np0005539505 systemd[1]: session-43.scope: Deactivated successfully.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Session 43 logged out. Waiting for processes to exit.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Removed session 43.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: New session 44 of user nova.
Nov 29 02:08:07 np0005539505 systemd[1]: Started Session 44 of User nova.
Nov 29 02:08:07 np0005539505 systemd[1]: session-44.scope: Deactivated successfully.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Session 44 logged out. Waiting for processes to exit.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Removed session 44.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: New session 45 of user nova.
Nov 29 02:08:07 np0005539505 systemd[1]: Started Session 45 of User nova.
Nov 29 02:08:07 np0005539505 systemd[1]: session-45.scope: Deactivated successfully.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Session 45 logged out. Waiting for processes to exit.
Nov 29 02:08:07 np0005539505 systemd-logind[794]: Removed session 45.
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.184 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.974 186962 DEBUG nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.974 186962 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.975 186962 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.975 186962 DEBUG oslo_concurrency.lockutils [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.975 186962 DEBUG nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:09 np0005539505 nova_compute[186958]: 2025-11-29 07:08:09.975 186962 WARNING nova.compute.manager [req-46ed2e01-0919-454e-8b8d-f60ad5d63cb2 req-41f87cec-4fe4-466e-845c-264353a624bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:08:10 np0005539505 nova_compute[186958]: 2025-11-29 07:08:10.478 186962 INFO nova.network.neutron [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating port b7078e73-f0e3-441a-843e-8920e38aec30 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:08:10 np0005539505 nova_compute[186958]: 2025-11-29 07:08:10.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:11 np0005539505 nova_compute[186958]: 2025-11-29 07:08:11.822 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:11 np0005539505 nova_compute[186958]: 2025-11-29 07:08:11.823 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:11 np0005539505 nova_compute[186958]: 2025-11-29 07:08:11.823 186962 DEBUG nova.network.neutron [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:08:12 np0005539505 nova_compute[186958]: 2025-11-29 07:08:12.008 186962 DEBUG nova.compute.manager [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:12 np0005539505 nova_compute[186958]: 2025-11-29 07:08:12.008 186962 DEBUG nova.compute.manager [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing instance network info cache due to event network-changed-b7078e73-f0e3-441a-843e-8920e38aec30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:08:12 np0005539505 nova_compute[186958]: 2025-11-29 07:08:12.009 186962 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:14 np0005539505 nova_compute[186958]: 2025-11-29 07:08:14.186 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.406 186962 DEBUG nova.network.neutron [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.512 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.519 186962 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.519 186962 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Refreshing network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:08:15 np0005539505 nova_compute[186958]: 2025-11-29 07:08:15.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:15 np0005539505 podman[228104]: 2025-11-29 07:08:15.735735583 +0000 UTC m=+0.062047804 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:08:15 np0005539505 podman[228103]: 2025-11-29 07:08:15.75119082 +0000 UTC m=+0.081402861 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public)
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.389 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.391 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.391 186962 INFO nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Creating image(s)#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.392 186962 DEBUG nova.objects.instance [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.650 186962 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.715 186962 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.716 186962 DEBUG nova.virt.disk.api [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.717 186962 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.779 186962 DEBUG oslo_concurrency.processutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:16 np0005539505 nova_compute[186958]: 2025-11-29 07:08:16.780 186962 DEBUG nova.virt.disk.api [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.234 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.234 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Ensure instance console log exists: /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.235 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.236 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.236 186962 DEBUG oslo_concurrency.lockutils [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.239 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start _get_guest_xml network_info=[{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.246 186962 WARNING nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.254 186962 DEBUG nova.virt.libvirt.host [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.255 186962 DEBUG nova.virt.libvirt.host [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.258 186962 DEBUG nova.virt.libvirt.host [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.259 186962 DEBUG nova.virt.libvirt.host [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.260 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.261 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.261 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.261 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.262 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.262 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.262 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.262 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.262 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.263 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.263 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.263 186962 DEBUG nova.virt.hardware [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.263 186962 DEBUG nova.objects.instance [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.283 186962 DEBUG nova.virt.libvirt.vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.283 186962 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.284 186962 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.286 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <uuid>9223f44a-297e-4db1-9f44-ee0694c4e258</uuid>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <name>instance-00000042</name>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <memory>196608</memory>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerActionsTestJSON-server-664171356</nova:name>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:08:17</nova:creationTime>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.micro">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:memory>192</nova:memory>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        <nova:port uuid="b7078e73-f0e3-441a-843e-8920e38aec30">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="serial">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="uuid">9223f44a-297e-4db1-9f44-ee0694c4e258</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/disk.config"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:1e:a3:23"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <target dev="tapb7078e73-f0"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258/console.log" append="off"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:08:17 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:08:17 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:08:17 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:08:17 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.287 186962 DEBUG nova.virt.libvirt.vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.288 186962 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:1e:a3:23"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.288 186962 DEBUG nova.network.os_vif_util [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.289 186962 DEBUG os_vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.289 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.290 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.290 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.296 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.297 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7078e73-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.297 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7078e73-f0, col_values=(('external_ids', {'iface-id': 'b7078e73-f0e3-441a-843e-8920e38aec30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a3:23', 'vm-uuid': '9223f44a-297e-4db1-9f44-ee0694c4e258'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.299 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.3006] manager: (tapb7078e73-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.306 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.307 186962 INFO os_vif [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.403 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.403 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.404 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:1e:a3:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.405 186962 INFO nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Using config drive#033[00m
Nov 29 02:08:17 np0005539505 kernel: tapb7078e73-f0: entered promiscuous mode
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.4724] manager: (tapb7078e73-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.473 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:17Z|00327|binding|INFO|Claiming lport b7078e73-f0e3-441a-843e-8920e38aec30 for this chassis.
Nov 29 02:08:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:17Z|00328|binding|INFO|b7078e73-f0e3-441a-843e-8920e38aec30: Claiming fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.485 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.4869] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.4878] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.497 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '12', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.498 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.500 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:08:17 np0005539505 systemd-udevd[228166]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.514 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22b3e302-1fe0-4105-9550-99c0a25bc6e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.515 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:08:17 np0005539505 systemd-machined[153285]: New machine qemu-42-instance-00000042.
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.517 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.517 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[840915bb-88a5-420e-8bf8-922bd28f9623]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83e1df36-1dab-4a1b-b97c-7eadd5c0dba7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.5231] device (tapb7078e73-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.5242] device (tapb7078e73-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.529 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[626f7a43-1731-4bf6-87c2-641e755dfe7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.554 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b01b69c4-0b52-47e3-89c6-16eb4a2f771e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 systemd[1]: Started Virtual Machine qemu-42-instance-00000042.
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.580 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.586 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.586 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[64f81e11-6934-4f61-80ca-e6ecde4ac9a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:17Z|00329|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 ovn-installed in OVS
Nov 29 02:08:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:17Z|00330|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 up in Southbound
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.5966] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/169)
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.595 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99d89447-effe-4999-865b-321c0daab488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.597 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.628 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[591405a4-838e-4c8c-8ec9-c297c0813383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.631 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ce03ebaf-d330-4a09-b653-42cc7f111343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.6565] device (tap9226dea3-60): carrier: link connected
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.662 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8448b1-f7e5-43c3-b6cf-840a2b37c1db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.676 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b2311b1d-67a5-451a-9f73-db0b70129cd1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554524, 'reachable_time': 29254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228200, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.689 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[05ffd62c-769a-4622-a63b-8126e9de5ef1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554524, 'tstamp': 554524}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228201, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.703 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f56fbf8e-7307-47ec-a60a-0cf738abcab7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554524, 'reachable_time': 29254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228202, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.729 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cce8ba44-567a-4ebb-b51d-dc6ccb479570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.782 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dda8024e-ddd2-4392-b772-7049cef0d566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.783 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.784 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.784 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:08:17 np0005539505 NetworkManager[55134]: <info>  [1764400097.7868] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.786 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.788 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.793 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.794 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:17Z|00331|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.796 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.807 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.808 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9b2fd5-1abe-463a-be59-b93800dc16ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.808 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:08:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:17.809 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.950 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400097.948655, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.952 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.954 186962 DEBUG nova.compute.manager [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.957 186962 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance running successfully.#033[00m
Nov 29 02:08:17 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.960 186962 DEBUG nova.virt.libvirt.guest [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.960 186962 DEBUG nova.virt.libvirt.driver [None req-391665d1-684c-4ff5-9abd-06034a548110 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.989 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:17 np0005539505 nova_compute[186958]: 2025-11-29 07:08:17.993 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.047 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.047 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400097.9497764, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.048 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Started (Lifecycle Event)#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.081 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.085 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:08:18 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:08:18 np0005539505 systemd[228045]: Activating special unit Exit the Session...
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped target Main User Target.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped target Basic System.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped target Paths.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped target Sockets.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped target Timers.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:08:18 np0005539505 systemd[228045]: Closed D-Bus User Message Bus Socket.
Nov 29 02:08:18 np0005539505 systemd[228045]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:08:18 np0005539505 systemd[228045]: Removed slice User Application Slice.
Nov 29 02:08:18 np0005539505 systemd[228045]: Reached target Shutdown.
Nov 29 02:08:18 np0005539505 systemd[228045]: Finished Exit the Session.
Nov 29 02:08:18 np0005539505 systemd[228045]: Reached target Exit the Session.
Nov 29 02:08:18 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:08:18 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:08:18 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:08:18 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:08:18 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:08:18 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:08:18 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:08:18 np0005539505 podman[228241]: 2025-11-29 07:08:18.211545329 +0000 UTC m=+0.068257671 container create c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:08:18 np0005539505 systemd[1]: Started libpod-conmon-c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7.scope.
Nov 29 02:08:18 np0005539505 podman[228241]: 2025-11-29 07:08:18.169611093 +0000 UTC m=+0.026323485 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:08:18 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:08:18 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/468bf383fe917361bfb7024bf54e5d8a6997f4159495d09996df0a60cb7a3b51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:08:18 np0005539505 podman[228241]: 2025-11-29 07:08:18.308124129 +0000 UTC m=+0.164836491 container init c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:08:18 np0005539505 podman[228241]: 2025-11-29 07:08:18.315266671 +0000 UTC m=+0.171979003 container start c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.326 186962 DEBUG nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.326 186962 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.326 186962 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.327 186962 DEBUG oslo_concurrency.lockutils [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.327 186962 DEBUG nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:18 np0005539505 nova_compute[186958]: 2025-11-29 07:08:18.327 186962 WARNING nova.compute.manager [req-8ebda652-ae5d-4351-957f-fd43b6ba4ba6 req-3ace5552-61ed-426c-ae97-823acce11116 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:08:18 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [NOTICE]   (228261) : New worker (228263) forked
Nov 29 02:08:18 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [NOTICE]   (228261) : Loading success.
Nov 29 02:08:19 np0005539505 nova_compute[186958]: 2025-11-29 07:08:19.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:19 np0005539505 nova_compute[186958]: 2025-11-29 07:08:19.500 186962 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updated VIF entry in instance network info cache for port b7078e73-f0e3-441a-843e-8920e38aec30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:08:19 np0005539505 nova_compute[186958]: 2025-11-29 07:08:19.502 186962 DEBUG nova.network.neutron [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [{"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:19 np0005539505 nova_compute[186958]: 2025-11-29 07:08:19.533 186962 DEBUG oslo_concurrency.lockutils [req-18fc88d6-69ee-451c-ad11-5913e1c24289 req-b515a70a-ee62-49ab-bf1d-951c65d2e305 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-9223f44a-297e-4db1-9f44-ee0694c4e258" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:19 np0005539505 podman[228273]: 2025-11-29 07:08:19.73924257 +0000 UTC m=+0.066952644 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.577 186962 DEBUG nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.579 186962 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.579 186962 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.579 186962 DEBUG oslo_concurrency.lockutils [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.579 186962 DEBUG nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.580 186962 WARNING nova.compute.manager [req-92747e1b-73cf-4bbb-893d-4c57d35ccf9f req-05cfdc6d-c213-4d10-929d-88abf45c17fd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:08:20 np0005539505 nova_compute[186958]: 2025-11-29 07:08:20.765 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:22 np0005539505 nova_compute[186958]: 2025-11-29 07:08:22.301 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:24 np0005539505 nova_compute[186958]: 2025-11-29 07:08:24.191 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:26.945 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:26.947 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:26.948 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:27 np0005539505 nova_compute[186958]: 2025-11-29 07:08:27.304 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:28 np0005539505 podman[228292]: 2025-11-29 07:08:28.716457973 +0000 UTC m=+0.050883709 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:08:28 np0005539505 podman[228293]: 2025-11-29 07:08:28.80332967 +0000 UTC m=+0.134798983 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.194 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.210 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.211 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.248 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.478 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.479 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.487 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:08:29 np0005539505 nova_compute[186958]: 2025-11-29 07:08:29.487 186962 INFO nova.compute.claims [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:08:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:30Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:a3:23 10.100.0.9
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.120 186962 DEBUG nova.compute.provider_tree [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.316 186962 DEBUG nova.scheduler.client.report [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.359 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.360 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.465 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.465 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.490 186962 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.550 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.765 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.766 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.767 186962 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Creating image(s)#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.767 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.767 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.768 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.781 186962 DEBUG nova.policy [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.783 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.843 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.844 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.845 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.857 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.908 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.909 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.942 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.943 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:30 np0005539505 nova_compute[186958]: 2025-11-29 07:08:30.943 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.008 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.009 186962 DEBUG nova.virt.disk.api [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Checking if we can resize image /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.010 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.063 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.066 186962 DEBUG nova.virt.disk.api [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Cannot resize image /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.067 186962 DEBUG nova.objects.instance [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'migration_context' on Instance uuid 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.083 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.084 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Ensure instance console log exists: /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.084 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.084 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:31 np0005539505 nova_compute[186958]: 2025-11-29 07:08:31.085 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:32 np0005539505 nova_compute[186958]: 2025-11-29 07:08:32.308 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:33 np0005539505 nova_compute[186958]: 2025-11-29 07:08:33.276 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Successfully created port: 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:08:33 np0005539505 podman[228361]: 2025-11-29 07:08:33.73289945 +0000 UTC m=+0.066893822 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.136 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.137 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.137 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.137 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.138 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.153 186962 INFO nova.compute.manager [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Terminating instance#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.166 186962 DEBUG nova.compute.manager [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:08:34 np0005539505 kernel: tapb7078e73-f0 (unregistering): left promiscuous mode
Nov 29 02:08:34 np0005539505 NetworkManager[55134]: <info>  [1764400114.1856] device (tapb7078e73-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:34Z|00332|binding|INFO|Releasing lport b7078e73-f0e3-441a-843e-8920e38aec30 from this chassis (sb_readonly=0)
Nov 29 02:08:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:34Z|00333|binding|INFO|Setting lport b7078e73-f0e3-441a-843e-8920e38aec30 down in Southbound
Nov 29 02:08:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:34Z|00334|binding|INFO|Removing iface tapb7078e73-f0 ovn-installed in OVS
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.195 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.210 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a3:23 10.100.0.9'], port_security=['fa:16:3e:1e:a3:23 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9223f44a-297e-4db1-9f44-ee0694c4e258', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '14', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b7078e73-f0e3-441a-843e-8920e38aec30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.212 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b7078e73-f0e3-441a-843e-8920e38aec30 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.213 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.214 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.215 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c87d8860-f4cd-4864-b62a-bf09d420e39b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.216 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:08:34 np0005539505 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000042.scope: Deactivated successfully.
Nov 29 02:08:34 np0005539505 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000042.scope: Consumed 12.830s CPU time.
Nov 29 02:08:34 np0005539505 systemd-machined[153285]: Machine qemu-42-instance-00000042 terminated.
Nov 29 02:08:34 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [NOTICE]   (228261) : haproxy version is 2.8.14-c23fe91
Nov 29 02:08:34 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [NOTICE]   (228261) : path to executable is /usr/sbin/haproxy
Nov 29 02:08:34 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [WARNING]  (228261) : Exiting Master process...
Nov 29 02:08:34 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [ALERT]    (228261) : Current worker (228263) exited with code 143 (Terminated)
Nov 29 02:08:34 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[228257]: [WARNING]  (228261) : All workers exited. Exiting... (0)
Nov 29 02:08:34 np0005539505 systemd[1]: libpod-c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7.scope: Deactivated successfully.
Nov 29 02:08:34 np0005539505 podman[228406]: 2025-11-29 07:08:34.345111309 +0000 UTC m=+0.046076484 container died c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:08:34 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7-userdata-shm.mount: Deactivated successfully.
Nov 29 02:08:34 np0005539505 systemd[1]: var-lib-containers-storage-overlay-468bf383fe917361bfb7024bf54e5d8a6997f4159495d09996df0a60cb7a3b51-merged.mount: Deactivated successfully.
Nov 29 02:08:34 np0005539505 podman[228406]: 2025-11-29 07:08:34.391673485 +0000 UTC m=+0.092638660 container cleanup c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:08:34 np0005539505 systemd[1]: libpod-conmon-c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7.scope: Deactivated successfully.
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.429 186962 INFO nova.virt.libvirt.driver [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Instance destroyed successfully.#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.429 186962 DEBUG nova.objects.instance [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 9223f44a-297e-4db1-9f44-ee0694c4e258 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:34 np0005539505 podman[228444]: 2025-11-29 07:08:34.457790494 +0000 UTC m=+0.043691856 container remove c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.463 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b0daf50-23bd-4a6a-aa8f-d21249531f10]: (4, ('Sat Nov 29 07:08:34 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7)\nc8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7\nSat Nov 29 07:08:34 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7)\nc8e0a5f953bb470747ad25357a73b71f0d850b592e697f80d00f51a7c86b61c7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.464 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9b0bf9-e4ac-49f6-8154-7f9fe788d3a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.465 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.486 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b04780f-03b0-4b28-99d6-4df359660f16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.500 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfc3f8e-29f9-4f3c-84e3-9044e800c96a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.502 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f75e652-604a-4a2a-9fbf-6d93b9d202a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f186a725-33ec-4b5c-b663-024deb51ddce]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554516, 'reachable_time': 32988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228471, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.522 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:08:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:34.523 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[21463797-5c9f-4e6f-ad13-b3289277750e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:34 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.558 186962 DEBUG nova.virt.libvirt.vif [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:04:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-664171356',display_name='tempest-ServerActionsTestJSON-server-664171356',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-664171356',id=66,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:08:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-gfjum0fh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:08:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=9223f44a-297e-4db1-9f44-ee0694c4e258,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.559 186962 DEBUG nova.network.os_vif_util [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "b7078e73-f0e3-441a-843e-8920e38aec30", "address": "fa:16:3e:1e:a3:23", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7078e73-f0", "ovs_interfaceid": "b7078e73-f0e3-441a-843e-8920e38aec30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.559 186962 DEBUG nova.network.os_vif_util [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.560 186962 DEBUG os_vif [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.561 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.562 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7078e73-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.563 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.564 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.567 186962 INFO os_vif [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a3:23,bridge_name='br-int',has_traffic_filtering=True,id=b7078e73-f0e3-441a-843e-8920e38aec30,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7078e73-f0')#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.567 186962 INFO nova.virt.libvirt.driver [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Deleting instance files /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_del#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.574 186962 INFO nova.virt.libvirt.driver [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Deletion of /var/lib/nova/instances/9223f44a-297e-4db1-9f44-ee0694c4e258_del complete#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.965 186962 INFO nova.compute.manager [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.966 186962 DEBUG oslo.service.loopingcall [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.966 186962 DEBUG nova.compute.manager [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:08:34 np0005539505 nova_compute[186958]: 2025-11-29 07:08:34.966 186962 DEBUG nova.network.neutron [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.168 186962 DEBUG nova.compute.manager [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.169 186962 DEBUG oslo_concurrency.lockutils [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.169 186962 DEBUG oslo_concurrency.lockutils [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.169 186962 DEBUG oslo_concurrency.lockutils [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.169 186962 DEBUG nova.compute.manager [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.170 186962 DEBUG nova.compute.manager [req-bd483f93-732e-4bcd-bd4c-8302eb74c387 req-6500b695-b717-4aa8-b74f-035aa122f52f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-unplugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.466 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Successfully updated port: 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.483 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.483 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquired lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.484 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:08:35 np0005539505 podman[228472]: 2025-11-29 07:08:35.753209858 +0000 UTC m=+0.082244107 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.835 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:08:35 np0005539505 nova_compute[186958]: 2025-11-29 07:08:35.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:36 np0005539505 nova_compute[186958]: 2025-11-29 07:08:36.756 186962 DEBUG nova.network.neutron [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:36 np0005539505 nova_compute[186958]: 2025-11-29 07:08:36.785 186962 INFO nova.compute.manager [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Took 1.82 seconds to deallocate network for instance.#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.018 186962 DEBUG nova.compute.manager [req-9b55bc1a-0842-4d03-acc5-c5c917d1294d req-fca3eb1f-63eb-4403-9d02-e31d34353201 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-deleted-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.157 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.158 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.163 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.226 186962 INFO nova.scheduler.client.report [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Deleted allocations for instance 9223f44a-297e-4db1-9f44-ee0694c4e258#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.333 186962 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.333 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.333 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.334 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.334 186962 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] No waiting events found dispatching network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.334 186962 WARNING nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Received unexpected event network-vif-plugged-b7078e73-f0e3-441a-843e-8920e38aec30 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.334 186962 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-changed-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.334 186962 DEBUG nova.compute.manager [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Refreshing instance network info cache due to event network-changed-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.335 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.384 186962 DEBUG nova.network.neutron [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Updating instance_info_cache with network_info: [{"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.762 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.854 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Releasing lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.854 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Instance network_info: |[{"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.855 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.855 186962 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Refreshing network info cache for port 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.857 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Start _get_guest_xml network_info=[{"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.862 186962 WARNING nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.869 186962 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.870 186962 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.875 186962 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.876 186962 DEBUG nova.virt.libvirt.host [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.877 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.877 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.878 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.878 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.878 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.879 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.879 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.879 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.880 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.880 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.880 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.880 186962 DEBUG nova.virt.hardware [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.884 186962 DEBUG nova.virt.libvirt.vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-2',id=86,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleC
reateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:30Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=0db4c951-8b4f-4c3b-b7a0-ccb26138abcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.884 186962 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.885 186962 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.886 186962 DEBUG nova.objects.instance [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.911 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <uuid>0db4c951-8b4f-4c3b-b7a0-ccb26138abcd</uuid>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <name>instance-00000056</name>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-622119530-2</nova:name>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:08:37</nova:creationTime>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:user uuid="e621c9f314214c7980a4d441f0600e90">tempest-MultipleCreateTestJSON-910974113-project-member</nova:user>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:project uuid="a16c3c4eb5654a7f9742906d1a6f6698">tempest-MultipleCreateTestJSON-910974113</nova:project>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        <nova:port uuid="8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="serial">0db4c951-8b4f-4c3b-b7a0-ccb26138abcd</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="uuid">0db4c951-8b4f-4c3b-b7a0-ccb26138abcd</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.config"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:60:3c:47"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <target dev="tap8b5b5d87-ab"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/console.log" append="off"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:08:37 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:08:37 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:08:37 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:08:37 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.912 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Preparing to wait for external event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.913 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.913 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.913 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.914 186962 DEBUG nova.virt.libvirt.vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-2',id=86,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest
-MultipleCreateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:08:30Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=0db4c951-8b4f-4c3b-b7a0-ccb26138abcd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.915 186962 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.915 186962 DEBUG nova.network.os_vif_util [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.916 186962 DEBUG os_vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.917 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.917 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.918 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.920 186962 DEBUG oslo_concurrency.lockutils [None req-7f08395d-9b19-4c5e-9d8d-30990f0fa828 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "9223f44a-297e-4db1-9f44-ee0694c4e258" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.922 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b5b5d87-ab, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.922 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b5b5d87-ab, col_values=(('external_ids', {'iface-id': '8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:3c:47', 'vm-uuid': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.923 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539505 NetworkManager[55134]: <info>  [1764400117.9246] manager: (tap8b5b5d87-ab): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.926 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.929 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:37 np0005539505 nova_compute[186958]: 2025-11-29 07:08:37.930 186962 INFO os_vif [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab')#033[00m
Nov 29 02:08:38 np0005539505 nova_compute[186958]: 2025-11-29 07:08:38.459 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:38 np0005539505 nova_compute[186958]: 2025-11-29 07:08:38.460 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:08:38 np0005539505 nova_compute[186958]: 2025-11-29 07:08:38.460 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No VIF found with MAC fa:16:3e:60:3c:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:08:38 np0005539505 nova_compute[186958]: 2025-11-29 07:08:38.460 186962 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Using config drive#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.217 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.334 186962 INFO nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Creating config drive at /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.config#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.340 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzyw84aeb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.467 186962 DEBUG oslo_concurrency.processutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzyw84aeb" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:08:39 np0005539505 kernel: tap8b5b5d87-ab: entered promiscuous mode
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.5394] manager: (tap8b5b5d87-ab): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Nov 29 02:08:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:39Z|00335|binding|INFO|Claiming lport 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 for this chassis.
Nov 29 02:08:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:39Z|00336|binding|INFO|8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72: Claiming fa:16:3e:60:3c:47 10.100.0.9
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:39Z|00337|binding|INFO|Setting lport 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 ovn-installed in OVS
Nov 29 02:08:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:39Z|00338|binding|INFO|Setting lport 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 up in Southbound
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.553 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.553 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:3c:47 10.100.0.9'], port_security=['fa:16:3e:60:3c:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.555 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 bound to our chassis#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.556 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.558 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61999b35-f067-478e-ae7d-2c014e39aec6#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.562 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.574 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f201ff-5d7a-498d-862d-01c0fdb10d9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 systemd-udevd[228514]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.575 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61999b35-f1 in ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:08:39 np0005539505 systemd-machined[153285]: New machine qemu-43-instance-00000056.
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.578 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61999b35-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f6e927a-800c-4cba-906c-2a81d5d69439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.581 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a0537447-0068-43ca-9d4f-e4f239e29315]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 systemd[1]: Started Virtual Machine qemu-43-instance-00000056.
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.5942] device (tap8b5b5d87-ab): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.5954] device (tap8b5b5d87-ab): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.596 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8defeb43-26a3-434c-846c-f2c1550b5537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.619 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[89b2d4e0-2e2f-4c41-9e53-035509b14c78]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.649 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7767f733-8f73-4151-b56f-b204034cda01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.654 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc448aec-bc08-4be7-b497-0e4cc8f37b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.6558] manager: (tap61999b35-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.685 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9173acf9-1a71-4891-9773-e1b2311c29b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.688 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bbdef8b0-407e-4067-83fc-925461c858f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.7115] device (tap61999b35-f0): carrier: link connected
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.721 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7755433b-f9ca-47c7-ae71-2003f2c02f21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.739 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f32cc14-ce26-430c-a709-1f77ea5a66e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556729, 'reachable_time': 43197, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228547, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.754 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[40625a3b-8a95-4d11-98f8-ff751af10d1d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:e2e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556729, 'tstamp': 556729}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228548, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.769 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a9e720-93ca-476b-8691-223a4ba1a417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556729, 'reachable_time': 43197, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228549, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.799 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[95e6ae76-346c-43dd-b0d5-383f1914fdda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.860 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f9409c9f-e6a6-493b-9ca0-61cff227cc58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.862 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.862 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.862 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61999b35-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.864 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 NetworkManager[55134]: <info>  [1764400119.8653] manager: (tap61999b35-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Nov 29 02:08:39 np0005539505 kernel: tap61999b35-f0: entered promiscuous mode
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.867 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61999b35-f0, col_values=(('external_ids', {'iface-id': 'c68228ff-9afd-4bc1-81a6-230bf1aa485f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:39Z|00339|binding|INFO|Releasing lport c68228ff-9afd-4bc1-81a6-230bf1aa485f from this chassis (sb_readonly=0)
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.881 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.882 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.883 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0dac10-03b2-436c-9d3a-53f486ad5d93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.884 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:08:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:39.884 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'env', 'PROCESS_TAG=haproxy-61999b35-f067-478e-ae7d-2c014e39aec6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61999b35-f067-478e-ae7d-2c014e39aec6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.995 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400119.994639, 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:39 np0005539505 nova_compute[186958]: 2025-11-29 07:08:39.995 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] VM Started (Lifecycle Event)#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.103 186962 DEBUG nova.compute.manager [req-40bf2aa2-8d7a-482a-911e-0eff3b9b8369 req-fc2869f3-bb79-4eaa-8761-aafc71a644ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.103 186962 DEBUG oslo_concurrency.lockutils [req-40bf2aa2-8d7a-482a-911e-0eff3b9b8369 req-fc2869f3-bb79-4eaa-8761-aafc71a644ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.104 186962 DEBUG oslo_concurrency.lockutils [req-40bf2aa2-8d7a-482a-911e-0eff3b9b8369 req-fc2869f3-bb79-4eaa-8761-aafc71a644ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.104 186962 DEBUG oslo_concurrency.lockutils [req-40bf2aa2-8d7a-482a-911e-0eff3b9b8369 req-fc2869f3-bb79-4eaa-8761-aafc71a644ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.104 186962 DEBUG nova.compute.manager [req-40bf2aa2-8d7a-482a-911e-0eff3b9b8369 req-fc2869f3-bb79-4eaa-8761-aafc71a644ef 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Processing event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.105 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.113 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.118 186962 INFO nova.virt.libvirt.driver [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Instance spawned successfully.#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.119 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.129 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.133 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:08:40 np0005539505 podman[228588]: 2025-11-29 07:08:40.251285418 +0000 UTC m=+0.052281660 container create 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.263 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.264 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400119.9947972, 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.264 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.269 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.270 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.270 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.270 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.271 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.271 186962 DEBUG nova.virt.libvirt.driver [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:08:40 np0005539505 systemd[1]: Started libpod-conmon-34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238.scope.
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.315 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:40 np0005539505 podman[228588]: 2025-11-29 07:08:40.222857584 +0000 UTC m=+0.023853846 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.320 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400120.111737, 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.320 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:08:40 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:08:40 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97b7f0079fe4cdf6eed1a5d0a0cd4095f224b4fa2d21fd1007ae6a1e92d8bf38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:08:40 np0005539505 podman[228588]: 2025-11-29 07:08:40.34225397 +0000 UTC m=+0.143250232 container init 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:08:40 np0005539505 podman[228588]: 2025-11-29 07:08:40.348186707 +0000 UTC m=+0.149182959 container start 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.372 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:40 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [NOTICE]   (228607) : New worker (228609) forked
Nov 29 02:08:40 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [NOTICE]   (228607) : Loading success.
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.376 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.420 186962 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Updated VIF entry in instance network info cache for port 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.421 186962 DEBUG nova.network.neutron [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Updating instance_info_cache with network_info: [{"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.459 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.462 186962 DEBUG oslo_concurrency.lockutils [req-df0c6622-4788-45ed-8448-0a87005e5453 req-a72ef882-ce44-48da-8e3f-032e0657cd54 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.682 186962 INFO nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Took 9.92 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:08:40 np0005539505 nova_compute[186958]: 2025-11-29 07:08:40.683 186962 DEBUG nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.392 186962 DEBUG nova.compute.manager [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.393 186962 DEBUG oslo_concurrency.lockutils [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.394 186962 DEBUG oslo_concurrency.lockutils [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.394 186962 DEBUG oslo_concurrency.lockutils [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.394 186962 DEBUG nova.compute.manager [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] No waiting events found dispatching network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.394 186962 WARNING nova.compute.manager [req-2457c8fe-b075-4ae2-99bf-3dfc78447d70 req-24a89379-904c-41e7-8773-8da955c50402 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received unexpected event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.404 186962 INFO nova.compute.manager [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Took 13.04 seconds to build instance.#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.494 186962 DEBUG oslo_concurrency.lockutils [None req-2b12b80e-1297-42d6-9d91-fa9af62482a4 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:42 np0005539505 nova_compute[186958]: 2025-11-29 07:08:42.926 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:44 np0005539505 nova_compute[186958]: 2025-11-29 07:08:44.219 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:46 np0005539505 podman[228619]: 2025-11-29 07:08:46.720475425 +0000 UTC m=+0.052057143 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:08:46 np0005539505 podman[228618]: 2025-11-29 07:08:46.72811028 +0000 UTC m=+0.060753438 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Nov 29 02:08:47 np0005539505 nova_compute[186958]: 2025-11-29 07:08:47.929 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.087 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'name': 'tempest-tempest.common.compute-instance-622119530-2', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000056', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'hostId': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.088 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.089 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.089 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>]
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.089 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.119 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.latency volume: 111814264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.120 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.latency volume: 655859 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc3dc902-373a-4c5f-a841-438996c93eb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 111814264, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.089751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4049a038-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '50d5d29a47c60635b324a0f7a2eeec725a3d2da8139bf2be55a448aa804c75aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 655859, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.089751', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4049aefc-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '69dc3be2d75deca585453a1af652a3e17eb27296877732be310d00b1317826e1'}]}, 'timestamp': '2025-11-29 07:08:48.121011', '_unique_id': 'ff10876d9e9c4e1c9dd5d74ebb3be155'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.125 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.125 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42b639f2-a4c8-4e62-9de2-344337d1e3dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.125514', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404a7418-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': 'de877a014f3195d1fb9a7cfb6c74163c760e404f5395fd836592c40d1e00ad55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.125514', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404a81e2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '91a13b3ac4f19ad64d8757651e67392eee17478a2643f5bc39c0a15a975ea54a'}]}, 'timestamp': '2025-11-29 07:08:48.126377', '_unique_id': '294bdbe7629d48309f976c7aac788b24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.132 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd / tap8b5b5d87-ab inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.133 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0de8e9d-8e94-43ee-bb60-2bd762a7c0ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.130013', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '404b9df2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': 'e88dd84b59d68ab4d42a867ee79cd9ee6ba0568ee0a4810fd995318bbc7d226c'}]}, 'timestamp': '2025-11-29 07:08:48.133647', '_unique_id': 'b053f3da1a254e8e83e734c2f616f36d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.137 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07013846-a929-4aa1-80e8-ec3ee44e6757', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.137116', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '404c38a2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '60060073ce970df1626d0833d7687cf424eeaf9dd929e0967c514e5d6a975ac0'}]}, 'timestamp': '2025-11-29 07:08:48.137563', '_unique_id': '6d6614d9fa2c426ab998a90446a87370'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.151 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.152 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '030830ec-9024-48fe-9287-2cae5b6c50e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.141269', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404e7b26-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': '8267d2a85a4129cc49e2826d8ed95e5d19f1d969e4aea059d46a1aee84efe95b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.141269', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404e8684-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': '204caabcdccf39a10021322b3e303a4c0bb752c0e06a381f28d1d23cc523dd1b'}]}, 'timestamp': '2025-11-29 07:08:48.152729', '_unique_id': '59c458715973469d9a34abc08bf6bc77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.156 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a05241-363d-415a-9839-95aeb3e25a1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.156483', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '404f2bf2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '31de10943dfa468f9415ad73c6b3a801662de3f8f27fe67040968a23adaf463f'}]}, 'timestamp': '2025-11-29 07:08:48.156895', '_unique_id': '21f2ecb58ec9447692b056a3639a31e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.157 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.160 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.160 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>]
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.161 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.161 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6f62a9d-782c-4411-8613-1e392371da52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.160960', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '404fdb6a-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '06b4e201d252342185bfd2a85521798c92fc7e55671938dd19d2dc707839052e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 
'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.160960', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '404fe68c-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': 'ebaece4197590d1f33829bc3fd5202ef340368432eb5811a349bbecc7955470a'}]}, 'timestamp': '2025-11-29 07:08:48.161787', '_unique_id': '3b90b640d6ee4348a952007850d3629f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.164 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.165 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f7cea38-9095-4e9e-9941-d038406559b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.164908', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '405073b8-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': 'bb67823fa5044795daa33395cb9237966e2ca6a8632bb62d499276397207a764'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 
'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.164908', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4050859c-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': 'a8b72f94cf29013863e566102ce7d9281700d2915ab8fd806784a9e036b5170d'}]}, 'timestamp': '2025-11-29 07:08:48.165730', '_unique_id': 'dd2a4117607247fe8bb488f10583e9ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.168 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f135933-a7ad-4694-ba9e-3f366b808a54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.168881', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '40510fb2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '7ba34fa696bcf8ce2b15c6ce5f258ac8febe4c4d85d1d18bfc1750a8081b0450'}]}, 'timestamp': '2025-11-29 07:08:48.169293', '_unique_id': '04a88a45ecc54f04993ea88a2ffb2476'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.172 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ed91bf1-6890-4852-9ca2-520ed63b4b03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.172330', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '40519784-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '1b88331e03446a3b4dc51dd3f91845dbcb11ed37972d724b029a69bc224dbcf8'}]}, 'timestamp': '2025-11-29 07:08:48.172753', '_unique_id': '96af32c277524f6eb1e8839042ac0b59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.173 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.175 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fb72269-6e48-43b9-b368-cf5dc9ce57a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.175894', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '40522320-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': 'a879bc22599589b2f42d4bff09296ea88ec0133f33da59747f8492a002754e1f'}]}, 'timestamp': '2025-11-29 07:08:48.176395', '_unique_id': 'ee1e59876fa349148105b6662e1b3171'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.179 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.180 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '404d757d-51f6-4ad5-a776-11533d5bfa26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.179785', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4052bbbe-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': '52d7d217a61867638808f3aeddde53fcd6d04a9ab78198bf326811c98b896d29'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.179785', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4052c8f2-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': '37a05bd16dec1f4eca9328c36b5de8f953a8beadd5aca30cbb302d008f74c3ec'}]}, 'timestamp': '2025-11-29 07:08:48.180564', '_unique_id': 'd44a6231db4649e3bd3411be10344bc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.182 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05849f99-4f7a-4c97-985d-2314a729c556', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.181982', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '40536bfe-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': 'da00c8d2d891abf833b2e5421d2b7ece2fafae1f48dc8d90110345bf80676bab'}]}, 'timestamp': '2025-11-29 07:08:48.184829', '_unique_id': 'b993c0d8e0e34fbca8bcad4e066fe5ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.188 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.188 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>]
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.188 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e111d3b8-7952-4808-864c-502eafbebcd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.188692', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '405415cc-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': '711fa70edf0efee1823549d95ae8322d0c3467c797676daceb1017c5a7aa41ee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.188692', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '405421ac-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.782234504, 'message_signature': 'd9dbf5c8b2944faa82a8c93b0c8be4a30ca9d0b8a34ef93c9d10f45519b4cd91'}]}, 'timestamp': '2025-11-29 07:08:48.189385', '_unique_id': '58aaf2788e664d069e36e38e19e73060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.234 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.235 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.235 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.236 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-622119530-2>]
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.236 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b86c6e58-5363-4da6-8877-df970c51ab43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.236607', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '405b6a98-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': 'a837207e5ee2be1552bc823935ebcc6a394268af7cbd7f5aa550c474579ba03c'}]}, 'timestamp': '2025-11-29 07:08:48.237309', '_unique_id': '26b3c8cff9974341bc4dbd2d1192ddc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.242 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62e90d6e-0b76-4697-a3c3-662e2a082520', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.242605', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '405c53b8-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '1931cc60180c20ed6fa1dd0c9a849d484be1ad33a17ebbda981619709aa0d52a'}]}, 'timestamp': '2025-11-29 07:08:48.243256', '_unique_id': 'a9f6bd45d19c4c04bdbf5892b1f30dac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.245 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/cpu volume: 7860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3b9b4e7-d024-4b5a-bdaa-a319d1f1a2a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7860000000, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'timestamp': '2025-11-29T07:08:48.245762', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '405cc7d0-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.875354236, 'message_signature': 'c8b4de8e0c30407289603ebf3f935a275073aaa62d53e71a95b2b0c8c7fc2975'}]}, 'timestamp': '2025-11-29 07:08:48.246097', '_unique_id': '2d30e2e1512e4510b5d108888b97cd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.252 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.253 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2b0ee29-39aa-4a1e-8cba-ae9b4e5755e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.252941', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '405de200-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': 'f4ef70893d8ccf2deed1215f2d48dae4af31f04c4431fe37d354d342d22906dc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.252941', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '405dee94-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '91ac431d11f7461b46107a90c42d59d560a4699ee5b77267858633e522950323'}]}, 'timestamp': '2025-11-29 07:08:48.253618', '_unique_id': '42359f20aed94c5989652220bd72ba03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.255 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.256 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b25e280b-86e0-4c49-9991-d961b2522d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-vda', 'timestamp': '2025-11-29T07:08:48.255729', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '405e4d12-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '90e9736246d3da92853607052a0e7d0d35281f2133c9880cadafc75a6b0466cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 
'resource_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-sda', 'timestamp': '2025-11-29T07:08:48.255729', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'instance-00000056', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '405e57bc-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.730571223, 'message_signature': '1d0e5f336335ca83593e9477aa29ea8c80334f108070dcd6719c94450afa973c'}]}, 'timestamp': '2025-11-29 07:08:48.259076', '_unique_id': '2fe5ab6e8cd24fccb8cf2fb4d1df7c84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.260 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.264 12 DEBUG ceilometer.compute.pollsters [-] 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '897cc31b-9671-4921-9ec4-49ead3d37c53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_name': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_name': None, 'resource_id': 'instance-00000056-0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-tap8b5b5d87-ab', 'timestamp': '2025-11-29T07:08:48.264091', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-622119530-2', 'name': 'tap8b5b5d87-ab', 'instance_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'instance_type': 'm1.nano', 'host': '35c12200785e50b9b36344871ad772f9d6df97be303dd6bdb0657efa', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:60:3c:47', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8b5b5d87-ab'}, 'message_id': '405f9726-ccf2-11f0-8954-fa163e5a5606', 'monotonic_time': 5575.770877983, 'message_signature': '737da144f96dea3d39f95c406f7e6f030200037e795c5bb309d52407aba6e662'}]}, 'timestamp': '2025-11-29 07:08:48.264596', '_unique_id': '13f3c201354040fbb4aaed333baba369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:08:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:08:48.265 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:08:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:48.891 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:48 np0005539505 nova_compute[186958]: 2025-11-29 07:08:48.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:48 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:48.893 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:08:49 np0005539505 nova_compute[186958]: 2025-11-29 07:08:49.220 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:49 np0005539505 nova_compute[186958]: 2025-11-29 07:08:49.428 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400114.4268382, 9223f44a-297e-4db1-9f44-ee0694c4e258 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:08:49 np0005539505 nova_compute[186958]: 2025-11-29 07:08:49.429 186962 INFO nova.compute.manager [-] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:08:49 np0005539505 nova_compute[186958]: 2025-11-29 07:08:49.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:49 np0005539505 nova_compute[186958]: 2025-11-29 07:08:49.520 186962 DEBUG nova.compute.manager [None req-14eb39dd-0473-4438-821c-cad1acfa8de1 - - - - - -] [instance: 9223f44a-297e-4db1-9f44-ee0694c4e258] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.160 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.161 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.161 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.162 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.162 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.176 186962 INFO nova.compute.manager [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Terminating instance#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.196 186962 DEBUG nova.compute.manager [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:08:50 np0005539505 kernel: tap8b5b5d87-ab (unregistering): left promiscuous mode
Nov 29 02:08:50 np0005539505 NetworkManager[55134]: <info>  [1764400130.2291] device (tap8b5b5d87-ab): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.238 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:50Z|00340|binding|INFO|Releasing lport 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 from this chassis (sb_readonly=0)
Nov 29 02:08:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:50Z|00341|binding|INFO|Setting lport 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 down in Southbound
Nov 29 02:08:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:08:50Z|00342|binding|INFO|Removing iface tap8b5b5d87-ab ovn-installed in OVS
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.249 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:3c:47 10.100.0.9'], port_security=['fa:16:3e:60:3c:47 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0db4c951-8b4f-4c3b-b7a0-ccb26138abcd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.251 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 unbound from our chassis#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.253 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61999b35-f067-478e-ae7d-2c014e39aec6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.255 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0fa684-9d35-462a-8242-f5d008233339]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.257 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.257 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace which is not needed anymore#033[00m
Nov 29 02:08:50 np0005539505 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000056.scope: Deactivated successfully.
Nov 29 02:08:50 np0005539505 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000056.scope: Consumed 10.628s CPU time.
Nov 29 02:08:50 np0005539505 systemd-machined[153285]: Machine qemu-43-instance-00000056 terminated.
Nov 29 02:08:50 np0005539505 podman[228661]: 2025-11-29 07:08:50.345191142 +0000 UTC m=+0.082116632 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [NOTICE]   (228607) : haproxy version is 2.8.14-c23fe91
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [NOTICE]   (228607) : path to executable is /usr/sbin/haproxy
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [WARNING]  (228607) : Exiting Master process...
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [WARNING]  (228607) : Exiting Master process...
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [ALERT]    (228607) : Current worker (228609) exited with code 143 (Terminated)
Nov 29 02:08:50 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[228603]: [WARNING]  (228607) : All workers exited. Exiting... (0)
Nov 29 02:08:50 np0005539505 systemd[1]: libpod-34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238.scope: Deactivated successfully.
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:50 np0005539505 podman[228703]: 2025-11-29 07:08:50.379831182 +0000 UTC m=+0.044114208 container died 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:08:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238-userdata-shm.mount: Deactivated successfully.
Nov 29 02:08:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay-97b7f0079fe4cdf6eed1a5d0a0cd4095f224b4fa2d21fd1007ae6a1e92d8bf38-merged.mount: Deactivated successfully.
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.419 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 podman[228703]: 2025-11-29 07:08:50.420423329 +0000 UTC m=+0.084706345 container cleanup 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.427 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 systemd[1]: libpod-conmon-34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238.scope: Deactivated successfully.
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.454 186962 INFO nova.virt.libvirt.driver [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Instance destroyed successfully.#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.454 186962 DEBUG nova.objects.instance [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'resources' on Instance uuid 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.471 186962 DEBUG nova.virt.libvirt.vif [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-622119530',display_name='tempest-tempest.common.compute-instance-622119530-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-622119530-2',id=86,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-29T07:08:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-u25t76wp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:08:40Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=0db4c951-8b4f-4c3b-b7a0-ccb26138abcd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.471 186962 DEBUG nova.network.os_vif_util [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "address": "fa:16:3e:60:3c:47", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b5b5d87-ab", "ovs_interfaceid": "8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.472 186962 DEBUG nova.network.os_vif_util [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.472 186962 DEBUG os_vif [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.474 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b5b5d87-ab, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.475 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 podman[228739]: 2025-11-29 07:08:50.477927895 +0000 UTC m=+0.037822580 container remove 34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.481 186962 INFO os_vif [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:3c:47,bridge_name='br-int',has_traffic_filtering=True,id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8b5b5d87-ab')#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.481 186962 INFO nova.virt.libvirt.driver [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Deleting instance files /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd_del#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.482 186962 INFO nova.virt.libvirt.driver [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Deletion of /var/lib/nova/instances/0db4c951-8b4f-4c3b-b7a0-ccb26138abcd_del complete#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.483 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[acd4a872-14fa-4073-b853-c9e5bfe368f1]: (4, ('Sat Nov 29 07:08:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238)\n34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238\nSat Nov 29 07:08:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238)\n34faf26801d3d5e4d1e541e76a564af255be3b7572f7f3c0c4a4362e874f5238\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.484 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[068f3022-b066-49ad-b217-d01abac44ec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.485 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 kernel: tap61999b35-f0: left promiscuous mode
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.498 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.502 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd84a96-394b-492b-9175-202213181b91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.524 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f9819e6b-e736-424f-8266-1f1e01af102f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.525 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e0751c3e-0250-427d-b1bb-901bb8c23eb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.540 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[faa0cf38-2b0c-4d0b-8e85-78575c3e174a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556722, 'reachable_time': 15018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228761, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.542 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:08:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:50.542 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[432c7b5d-fe74-4efa-b94e-827636b231fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:08:50 np0005539505 systemd[1]: run-netns-ovnmeta\x2d61999b35\x2df067\x2d478e\x2dae7d\x2d2c014e39aec6.mount: Deactivated successfully.
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.715 186962 INFO nova.compute.manager [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.715 186962 DEBUG oslo.service.loopingcall [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.715 186962 DEBUG nova.compute.manager [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.716 186962 DEBUG nova.network.neutron [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.969 186962 DEBUG nova.compute.manager [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-unplugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.969 186962 DEBUG oslo_concurrency.lockutils [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.969 186962 DEBUG oslo_concurrency.lockutils [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.970 186962 DEBUG oslo_concurrency.lockutils [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.970 186962 DEBUG nova.compute.manager [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] No waiting events found dispatching network-vif-unplugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:50 np0005539505 nova_compute[186958]: 2025-11-29 07:08:50.970 186962 DEBUG nova.compute.manager [req-3c32456e-26f3-4d00-9d86-f2c4fbc13ff7 req-831d74aa-4525-4921-9501-09466fdcf13d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-unplugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:08:52 np0005539505 nova_compute[186958]: 2025-11-29 07:08:52.948 186962 DEBUG nova.network.neutron [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.161 186962 DEBUG nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.162 186962 DEBUG oslo_concurrency.lockutils [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.162 186962 DEBUG oslo_concurrency.lockutils [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.162 186962 DEBUG oslo_concurrency.lockutils [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.162 186962 DEBUG nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] No waiting events found dispatching network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.163 186962 WARNING nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received unexpected event network-vif-plugged-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.163 186962 DEBUG nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Received event network-vif-deleted-8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.163 186962 INFO nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Neutron deleted interface 8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.163 186962 DEBUG nova.network.neutron [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.382 186962 INFO nova.compute.manager [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Took 2.67 seconds to deallocate network for instance.#033[00m
Nov 29 02:08:53 np0005539505 nova_compute[186958]: 2025-11-29 07:08:53.582 186962 DEBUG nova.compute.manager [req-67d34169-3a17-487e-a5f4-3d8f61c37560 req-f7350f7b-574e-47b8-afaf-6083e75e992f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Detach interface failed, port_id=8b5b5d87-ab04-4a5f-84bf-67ce5d08bc72, reason: Instance 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.223 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.759 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.760 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.796 186962 DEBUG nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.815 186962 DEBUG nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.815 186962 DEBUG nova.compute.provider_tree [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.831 186962 DEBUG nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.849 186962 DEBUG nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:08:54 np0005539505 nova_compute[186958]: 2025-11-29 07:08:54.896 186962 DEBUG nova.compute.provider_tree [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:08:55 np0005539505 nova_compute[186958]: 2025-11-29 07:08:55.387 186962 DEBUG nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:08:55 np0005539505 nova_compute[186958]: 2025-11-29 07:08:55.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:55 np0005539505 nova_compute[186958]: 2025-11-29 07:08:55.705 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:55 np0005539505 nova_compute[186958]: 2025-11-29 07:08:55.801 186962 INFO nova.scheduler.client.report [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Deleted allocations for instance 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd#033[00m
Nov 29 02:08:56 np0005539505 nova_compute[186958]: 2025-11-29 07:08:56.143 186962 DEBUG oslo_concurrency.lockutils [None req-56529982-24a0-4671-9483-cabf402202b5 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "0db4c951-8b4f-4c3b-b7a0-ccb26138abcd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:08:56 np0005539505 nova_compute[186958]: 2025-11-29 07:08:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:57 np0005539505 nova_compute[186958]: 2025-11-29 07:08:57.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:08:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:08:58.896 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:08:59 np0005539505 nova_compute[186958]: 2025-11-29 07:08:59.224 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:08:59 np0005539505 podman[228762]: 2025-11-29 07:08:59.72719121 +0000 UTC m=+0.060699007 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:08:59 np0005539505 podman[228763]: 2025-11-29 07:08:59.765292467 +0000 UTC m=+0.095907902 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.422 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.423 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.424 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.424 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.589 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.591 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5682MB free_disk=73.2259407043457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.591 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.592 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.685 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 2198e4d8-4116-4747-b375-ba8212f745fd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.710 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 94c74b87-76f9-4489-a7d0-5abe91d0db7b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.710 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.711 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.773 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.790 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.839 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.840 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.867 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.870 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.870 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.917 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:00 np0005539505 nova_compute[186958]: 2025-11-29 07:09:00.918 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.137 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.219 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.220 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.227 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.227 186962 INFO nova.compute.claims [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.300 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.489 186962 DEBUG nova.compute.provider_tree [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.513 186962 DEBUG nova.scheduler.client.report [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.556 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.556 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.559 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.585 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.585 186962 INFO nova.compute.claims [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.665 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.665 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.701 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.750 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.799 186962 DEBUG nova.compute.provider_tree [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.826 186962 DEBUG nova.scheduler.client.report [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.889 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.889 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.916 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.917 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.917 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Creating image(s)#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.918 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.918 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.919 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.932 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.951 186962 DEBUG nova.policy [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.961 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.961 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.993 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.993 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:01 np0005539505 nova_compute[186958]: 2025-11-29 07:09:01.994 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.009 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.027 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.051 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.067 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.068 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.108 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.109 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.109 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.164 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.165 186962 DEBUG nova.virt.disk.api [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Checking if we can resize image /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.165 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.208 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.209 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.210 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Creating image(s)#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.210 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.211 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.211 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.224 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.224 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.243 186962 DEBUG nova.virt.disk.api [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Cannot resize image /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.244 186962 DEBUG nova.objects.instance [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'migration_context' on Instance uuid 2198e4d8-4116-4747-b375-ba8212f745fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.249 186962 DEBUG nova.policy [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e621c9f314214c7980a4d441f0600e90', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.262 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.263 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Ensure instance console log exists: /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.263 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.263 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.264 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.287 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.288 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.289 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.304 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.361 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.362 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.574 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk 1073741824" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.575 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.575 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.628 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.629 186962 DEBUG nova.virt.disk.api [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Checking if we can resize image /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.629 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.682 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.683 186962 DEBUG nova.virt.disk.api [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Cannot resize image /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.684 186962 DEBUG nova.objects.instance [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'migration_context' on Instance uuid 94c74b87-76f9-4489-a7d0-5abe91d0db7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.704 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.705 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Ensure instance console log exists: /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.706 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.706 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.707 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:02 np0005539505 nova_compute[186958]: 2025-11-29 07:09:02.858 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Successfully created port: 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:09:03 np0005539505 nova_compute[186958]: 2025-11-29 07:09:03.006 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Successfully created port: b239f9b7-bdd7-4374-a10b-614ad18cffda _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.597 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Successfully updated port: 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.615 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.615 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquired lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.616 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:04 np0005539505 podman[228842]: 2025-11-29 07:09:04.732972424 +0000 UTC m=+0.061341986 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.866 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.867 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.867 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.867 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.905 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.905 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:09:04 np0005539505 nova_compute[186958]: 2025-11-29 07:09:04.906 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.372 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.453 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400130.45251, 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.454 186962 INFO nova.compute.manager [-] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.475 186962 DEBUG nova.compute.manager [None req-a0636393-e7de-43cd-a43f-cec46fb5dfd3 - - - - - -] [instance: 0db4c951-8b4f-4c3b-b7a0-ccb26138abcd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.539 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Successfully updated port: b239f9b7-bdd7-4374-a10b-614ad18cffda _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.556 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.556 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquired lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.556 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.664 186962 DEBUG nova.compute.manager [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-changed-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.665 186962 DEBUG nova.compute.manager [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Refreshing instance network info cache due to event network-changed-b239f9b7-bdd7-4374-a10b-614ad18cffda. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:05 np0005539505 nova_compute[186958]: 2025-11-29 07:09:05.665 186962 DEBUG oslo_concurrency.lockutils [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.419 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:09:06 np0005539505 podman[228861]: 2025-11-29 07:09:06.728558253 +0000 UTC m=+0.059200884 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.783 186962 DEBUG nova.compute.manager [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-changed-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.784 186962 DEBUG nova.compute.manager [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Refreshing instance network info cache due to event network-changed-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:06 np0005539505 nova_compute[186958]: 2025-11-29 07:09:06.784 186962 DEBUG oslo_concurrency.lockutils [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.508 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Updating instance_info_cache with network_info: [{"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.589 186962 DEBUG nova.network.neutron [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Updating instance_info_cache with network_info: [{"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.623 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Releasing lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.624 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Instance network_info: |[{"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.624 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Releasing lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.624 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Instance network_info: |[{"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.625 186962 DEBUG oslo_concurrency.lockutils [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.625 186962 DEBUG nova.network.neutron [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Refreshing network info cache for port 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.629 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Start _get_guest_xml network_info=[{"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.630 186962 DEBUG oslo_concurrency.lockutils [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.630 186962 DEBUG nova.network.neutron [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Refreshing network info cache for port b239f9b7-bdd7-4374-a10b-614ad18cffda _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.633 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Start _get_guest_xml network_info=[{"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.640 186962 WARNING nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.642 186962 WARNING nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.647 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.648 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.648 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.648 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.652 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.652 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.654 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.654 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.655 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.655 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.655 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.656 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.656 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.656 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.656 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.657 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.657 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.657 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.661 186962 DEBUG nova.virt.libvirt.vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-1',id=90,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateT
estJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:01Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=2198e4d8-4116-4747-b375-ba8212f745fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.662 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.663 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.664 186962 DEBUG nova.objects.instance [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2198e4d8-4116-4747-b375-ba8212f745fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.665 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.666 186962 DEBUG nova.virt.libvirt.host [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.667 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.667 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.667 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.667 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.668 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.668 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.668 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.668 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.669 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.669 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.669 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.669 186962 DEBUG nova.virt.hardware [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.672 186962 DEBUG nova.virt.libvirt.vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-2',id=91,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateT
estJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:02Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=94c74b87-76f9-4489-a7d0-5abe91d0db7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.672 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.673 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.674 186962 DEBUG nova.objects.instance [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'pci_devices' on Instance uuid 94c74b87-76f9-4489-a7d0-5abe91d0db7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.703 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <uuid>2198e4d8-4116-4747-b375-ba8212f745fd</uuid>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <name>instance-0000005a</name>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:name>tempest-MultipleCreateTestJSON-server-996056207-1</nova:name>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:09:07</nova:creationTime>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:user uuid="e621c9f314214c7980a4d441f0600e90">tempest-MultipleCreateTestJSON-910974113-project-member</nova:user>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:project uuid="a16c3c4eb5654a7f9742906d1a6f6698">tempest-MultipleCreateTestJSON-910974113</nova:project>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:port uuid="87d28cdc-3ed5-4ff4-99a5-89e199e95f8b">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="serial">2198e4d8-4116-4747-b375-ba8212f745fd</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="uuid">2198e4d8-4116-4747-b375-ba8212f745fd</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.config"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:10:72:0c"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="tap87d28cdc-3e"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/console.log" append="off"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:09:07 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:09:07 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.705 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Preparing to wait for external event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.705 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.706 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.706 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.707 186962 DEBUG nova.virt.libvirt.vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-1',id=90,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-Multi
pleCreateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:01Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=2198e4d8-4116-4747-b375-ba8212f745fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.707 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.708 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.709 186962 DEBUG os_vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.710 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.710 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.713 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <uuid>94c74b87-76f9-4489-a7d0-5abe91d0db7b</uuid>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <name>instance-0000005b</name>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:name>tempest-MultipleCreateTestJSON-server-996056207-2</nova:name>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:09:07</nova:creationTime>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:user uuid="e621c9f314214c7980a4d441f0600e90">tempest-MultipleCreateTestJSON-910974113-project-member</nova:user>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:project uuid="a16c3c4eb5654a7f9742906d1a6f6698">tempest-MultipleCreateTestJSON-910974113</nova:project>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        <nova:port uuid="b239f9b7-bdd7-4374-a10b-614ad18cffda">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="serial">94c74b87-76f9-4489-a7d0-5abe91d0db7b</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="uuid">94c74b87-76f9-4489-a7d0-5abe91d0db7b</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.config"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:14:43:2a"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <target dev="tapb239f9b7-bd"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/console.log" append="off"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:09:07 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:09:07 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:09:07 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:09:07 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.714 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Preparing to wait for external event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.715 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.715 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.715 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.716 186962 DEBUG nova.virt.libvirt.vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-2',id=91,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-Multi
pleCreateTestJSON-910974113-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:02Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=94c74b87-76f9-4489-a7d0-5abe91d0db7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.716 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.717 186962 DEBUG nova.network.os_vif_util [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.717 186962 DEBUG os_vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.718 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.718 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.719 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.720 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.721 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87d28cdc-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.721 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87d28cdc-3e, col_values=(('external_ids', {'iface-id': '87d28cdc-3ed5-4ff4-99a5-89e199e95f8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:72:0c', 'vm-uuid': '2198e4d8-4116-4747-b375-ba8212f745fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.723 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 NetworkManager[55134]: <info>  [1764400147.7247] manager: (tap87d28cdc-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.726 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.729 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.730 186962 INFO os_vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e')#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.731 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.731 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb239f9b7-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.732 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb239f9b7-bd, col_values=(('external_ids', {'iface-id': 'b239f9b7-bdd7-4374-a10b-614ad18cffda', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:43:2a', 'vm-uuid': '94c74b87-76f9-4489-a7d0-5abe91d0db7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.733 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 NetworkManager[55134]: <info>  [1764400147.7340] manager: (tapb239f9b7-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/176)
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.740 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:07 np0005539505 nova_compute[186958]: 2025-11-29 07:09:07.741 186962 INFO os_vif [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd')#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.029 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.029 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.030 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No VIF found with MAC fa:16:3e:10:72:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.030 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Using config drive#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.032 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.033 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.033 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] No VIF found with MAC fa:16:3e:14:43:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.033 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Using config drive#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.587 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Creating config drive at /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.config#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.592 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4w2qvxga execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.637 186962 INFO nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Creating config drive at /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.config#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.643 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhzbrzb0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.715 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4w2qvxga" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.766 186962 DEBUG oslo_concurrency.processutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhzbrzb0" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.7754] manager: (tapb239f9b7-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/177)
Nov 29 02:09:08 np0005539505 kernel: tapb239f9b7-bd: entered promiscuous mode
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00343|binding|INFO|Claiming lport b239f9b7-bdd7-4374-a10b-614ad18cffda for this chassis.
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00344|binding|INFO|b239f9b7-bdd7-4374-a10b-614ad18cffda: Claiming fa:16:3e:14:43:2a 10.100.0.14
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.780 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.788 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:43:2a 10.100.0.14'], port_security=['fa:16:3e:14:43:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '94c74b87-76f9-4489-a7d0-5abe91d0db7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b239f9b7-bdd7-4374-a10b-614ad18cffda) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.789 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b239f9b7-bdd7-4374-a10b-614ad18cffda in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 bound to our chassis#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.791 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61999b35-f067-478e-ae7d-2c014e39aec6#033[00m
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00345|binding|INFO|Setting lport b239f9b7-bdd7-4374-a10b-614ad18cffda ovn-installed in OVS
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00346|binding|INFO|Setting lport b239f9b7-bdd7-4374-a10b-614ad18cffda up in Southbound
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.806 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2c504377-6dde-47fa-90eb-935e72131aed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.807 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61999b35-f1 in ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:09:08 np0005539505 systemd-udevd[228914]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.809 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61999b35-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.809 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a7c30d-c169-4707-9baf-2fef544cc292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.813 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9dae9af0-f126-4e84-9e85-9bcb0078a8fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 systemd-machined[153285]: New machine qemu-44-instance-0000005b.
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8240] device (tapb239f9b7-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8256] device (tapb239f9b7-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.824 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc865d7-a21d-4d9e-8dca-f7170c22c796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 kernel: tap87d28cdc-3e: entered promiscuous mode
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8343] manager: (tap87d28cdc-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.834 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.839 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00347|binding|INFO|Claiming lport 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b for this chassis.
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00348|binding|INFO|87d28cdc-3ed5-4ff4-99a5-89e199e95f8b: Claiming fa:16:3e:10:72:0c 10.100.0.5
Nov 29 02:09:08 np0005539505 systemd[1]: Started Virtual Machine qemu-44-instance-0000005b.
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8478] device (tap87d28cdc-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8485] device (tap87d28cdc-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.848 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:72:0c 10.100.0.5'], port_security=['fa:16:3e:10:72:0c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2198e4d8-4116-4747-b375-ba8212f745fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '2', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00349|binding|INFO|Setting lport 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b ovn-installed in OVS
Nov 29 02:09:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:08Z|00350|binding|INFO|Setting lport 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b up in Southbound
Nov 29 02:09:08 np0005539505 nova_compute[186958]: 2025-11-29 07:09:08.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66ebac17-8783-4cf5-963b-f1edfe61c60c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 systemd-machined[153285]: New machine qemu-45-instance-0000005a.
Nov 29 02:09:08 np0005539505 systemd[1]: Started Virtual Machine qemu-45-instance-0000005a.
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.893 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f13670a0-8464-4573-9b09-c7f8ca47c3d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.897 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c29a0db2-dfa8-4fa2-8abf-f682c1a4a058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.8992] manager: (tap61999b35-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/179)
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.934 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[98f4ce4b-13e0-45e4-8893-d15439f6000f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.937 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[25b56c8a-3b95-4956-ba58-1edfecdb0ae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 NetworkManager[55134]: <info>  [1764400148.9624] device (tap61999b35-f0): carrier: link connected
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.968 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8be6a697-8a37-459d-a579-7ab78cb8b0ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:08.984 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[428b8b1d-73e6-49f6-b906-34a5779ddc6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559654, 'reachable_time': 20157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228963, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.000 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b43c83-bb4d-4046-9408-cd54704427f0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:e2e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559654, 'tstamp': 559654}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228964, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.016 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[58d8603b-db4e-4476-a3de-28e1e694f1bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559654, 'reachable_time': 20157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228965, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.044 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3a318256-cf5e-4772-b152-53ed6e715cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.103 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[59c18dea-3e59-4dee-a100-1eaf3d4d5e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.104 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.105 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.105 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61999b35-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.107 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:09 np0005539505 NetworkManager[55134]: <info>  [1764400149.1076] manager: (tap61999b35-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Nov 29 02:09:09 np0005539505 kernel: tap61999b35-f0: entered promiscuous mode
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.125 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.127 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61999b35-f0, col_values=(('external_ids', {'iface-id': 'c68228ff-9afd-4bc1-81a6-230bf1aa485f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.128 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:09Z|00351|binding|INFO|Releasing lport c68228ff-9afd-4bc1-81a6-230bf1aa485f from this chassis (sb_readonly=0)
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.141 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.142 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.143 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a989fd52-8e2d-4433-b0ad-2864f42f1f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.144 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/61999b35-f067-478e-ae7d-2c014e39aec6.pid.haproxy
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 61999b35-f067-478e-ae7d-2c014e39aec6
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:09:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:09.145 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'env', 'PROCESS_TAG=haproxy-61999b35-f067-478e-ae7d-2c014e39aec6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61999b35-f067-478e-ae7d-2c014e39aec6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.175 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400149.1753707, 94c74b87-76f9-4489-a7d0-5abe91d0db7b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.176 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.217 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.221 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400149.1780393, 94c74b87-76f9-4489-a7d0-5abe91d0db7b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.222 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.228 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.261 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.265 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.288 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.289 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400149.2647007, 2198e4d8-4116-4747-b375-ba8212f745fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.290 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] VM Started (Lifecycle Event)#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.313 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.317 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400149.264778, 2198e4d8-4116-4747-b375-ba8212f745fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.317 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.334 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.337 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:09 np0005539505 nova_compute[186958]: 2025-11-29 07:09:09.361 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:09 np0005539505 podman[229011]: 2025-11-29 07:09:09.476473602 +0000 UTC m=+0.021060147 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:09:09 np0005539505 podman[229011]: 2025-11-29 07:09:09.736553685 +0000 UTC m=+0.281140200 container create 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:09:09 np0005539505 systemd[1]: Started libpod-conmon-9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b.scope.
Nov 29 02:09:09 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:09:09 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e3ecf043b4516753e7a5aba439bb0499e79d8cd0a5718525e7449c0a410e88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:09:10 np0005539505 podman[229011]: 2025-11-29 07:09:10.318665294 +0000 UTC m=+0.863251829 container init 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:09:10 np0005539505 podman[229011]: 2025-11-29 07:09:10.325011573 +0000 UTC m=+0.869598088 container start 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:09:10 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [NOTICE]   (229031) : New worker (229033) forked
Nov 29 02:09:10 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [NOTICE]   (229031) : Loading success.
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.472 186962 DEBUG nova.network.neutron [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Updated VIF entry in instance network info cache for port 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.473 186962 DEBUG nova.network.neutron [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Updating instance_info_cache with network_info: [{"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.490 186962 DEBUG oslo_concurrency.lockutils [req-17d69b3c-7be0-4d4e-b140-18908d394990 req-2a275737-081d-4297-89fd-774c82abcbdb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2198e4d8-4116-4747-b375-ba8212f745fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.729 186962 DEBUG nova.network.neutron [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Updated VIF entry in instance network info cache for port b239f9b7-bdd7-4374-a10b-614ad18cffda. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.729 186962 DEBUG nova.network.neutron [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Updating instance_info_cache with network_info: [{"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.744 186962 DEBUG oslo_concurrency.lockutils [req-ea231294-344c-48d3-a8fd-53411e2cd3fd req-a8b81134-540d-49c6-8518-b1f1b659a85e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-94c74b87-76f9-4489-a7d0-5abe91d0db7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.841 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.842 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.842 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.842 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.843 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Processing event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.843 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.843 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.843 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.844 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.844 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] No waiting events found dispatching network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.844 186962 WARNING nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received unexpected event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.845 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.845 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.845 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.845 186962 DEBUG oslo_concurrency.lockutils [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.845 186962 DEBUG nova.compute.manager [req-91ecb836-4182-4a05-b248-f7437f7fbefb req-6fcdb1a1-d71d-4467-b765-3268f60d1e45 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Processing event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.846 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.847 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.851 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400150.8507164, 2198e4d8-4116-4747-b375-ba8212f745fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.851 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.853 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.853 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.859 186962 INFO nova.virt.libvirt.driver [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Instance spawned successfully.#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.859 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.861 186962 INFO nova.virt.libvirt.driver [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Instance spawned successfully.#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.862 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.884 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.889 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.902 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.902 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.903 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.903 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.904 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.904 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.915 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.916 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.916 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.917 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.917 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.918 186962 DEBUG nova.virt.libvirt.driver [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.920 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.921 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400150.8528943, 94c74b87-76f9-4489-a7d0-5abe91d0db7b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.921 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.960 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:10 np0005539505 nova_compute[186958]: 2025-11-29 07:09:10.962 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:10.987 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 unbound from our chassis#033[00m
Nov 29 02:09:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:10.989 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61999b35-f067-478e-ae7d-2c014e39aec6#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.003 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6482725f-3b67-4399-852f-e0e35581da8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.005 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.031 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[733892f8-b6c7-4f53-8d09-a2f655b2b0fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.034 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[45c09388-63d9-4bdb-aeff-34d43acc5763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.035 186962 INFO nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Took 9.12 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.036 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.038 186962 INFO nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Took 8.83 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.038 186962 DEBUG nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.061 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[745c25db-32bc-42e2-862d-bbbb2993318f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.077 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8caca4-98ac-45d7-8f26-5eea534b6675]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 5, 'tx_packets': 5, 'rx_bytes': 442, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 5, 'tx_packets': 5, 'rx_bytes': 442, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559654, 'reachable_time': 20157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 5, 'inoctets': 372, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 5, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 372, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 5, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229047, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.092 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[35f2310d-4e0f-4a58-a852-bbac28aeb58b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap61999b35-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559665, 'tstamp': 559665}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229048, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap61999b35-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559668, 'tstamp': 559668}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229048, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.094 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.124 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.125 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.125 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61999b35-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.126 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.126 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61999b35-f0, col_values=(('external_ids', {'iface-id': 'c68228ff-9afd-4bc1-81a6-230bf1aa485f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:11.126 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.154 186962 INFO nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Took 9.97 seconds to build instance.#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.169 186962 INFO nova.compute.manager [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Took 9.91 seconds to build instance.#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.175 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:11 np0005539505 nova_compute[186958]: 2025-11-29 07:09:11.186 186962 DEBUG oslo_concurrency.lockutils [None req-f0c7b620-223a-4b7b-8418-a92a43dfd753 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.934 186962 DEBUG nova.compute.manager [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.935 186962 DEBUG oslo_concurrency.lockutils [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.935 186962 DEBUG oslo_concurrency.lockutils [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.935 186962 DEBUG oslo_concurrency.lockutils [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.936 186962 DEBUG nova.compute.manager [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] No waiting events found dispatching network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:12 np0005539505 nova_compute[186958]: 2025-11-29 07:09:12.936 186962 WARNING nova.compute.manager [req-63f0a7d1-6e59-40b7-adac-a667e8a19cac req-f17829c5-1c6d-4f73-9bf4-7a6d04e0fdb8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received unexpected event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b for instance with vm_state active and task_state None.#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.813 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.814 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.814 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.815 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.815 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.825 186962 INFO nova.compute.manager [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Terminating instance#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.844 186962 DEBUG nova.compute.manager [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:09:13 np0005539505 kernel: tap87d28cdc-3e (unregistering): left promiscuous mode
Nov 29 02:09:13 np0005539505 NetworkManager[55134]: <info>  [1764400153.8661] device (tap87d28cdc-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:09:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:13Z|00352|binding|INFO|Releasing lport 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b from this chassis (sb_readonly=0)
Nov 29 02:09:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:13Z|00353|binding|INFO|Setting lport 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b down in Southbound
Nov 29 02:09:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:13Z|00354|binding|INFO|Removing iface tap87d28cdc-3e ovn-installed in OVS
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.876 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:13 np0005539505 nova_compute[186958]: 2025-11-29 07:09:13.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.889 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:72:0c 10.100.0.5'], port_security=['fa:16:3e:10:72:0c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '2198e4d8-4116-4747-b375-ba8212f745fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.890 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 87d28cdc-3ed5-4ff4-99a5-89e199e95f8b in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 unbound from our chassis#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.891 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61999b35-f067-478e-ae7d-2c014e39aec6#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.907 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[14c70318-78b3-43ff-9da5-186c14b0b884]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:13 np0005539505 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Nov 29 02:09:13 np0005539505 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d0000005a.scope: Consumed 3.276s CPU time.
Nov 29 02:09:13 np0005539505 systemd-machined[153285]: Machine qemu-45-instance-0000005a terminated.
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.938 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2e188687-577b-405d-93d7-45f993881f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.942 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[60d6584e-f946-4785-ade3-a1e800515a41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.975 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7f5d52-f24a-45a0-993a-6f1a388e601c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:13.991 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc08bfa-cad1-41ca-909f-e7ddb42d3a6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61999b35-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:e2:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559654, 'reachable_time': 20157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229061, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.008 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a75b5306-a24d-43a7-be70-fc90d3c08a10]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap61999b35-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559665, 'tstamp': 559665}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229062, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap61999b35-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 559668, 'tstamp': 559668}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229062, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.010 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.011 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.015 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.015 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61999b35-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.015 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.016 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61999b35-f0, col_values=(('external_ids', {'iface-id': 'c68228ff-9afd-4bc1-81a6-230bf1aa485f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.016 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.106 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.107 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.107 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.107 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.108 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.121 186962 INFO nova.compute.manager [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Terminating instance#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.125 186962 INFO nova.virt.libvirt.driver [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Instance destroyed successfully.#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.125 186962 DEBUG nova.objects.instance [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'resources' on Instance uuid 2198e4d8-4116-4747-b375-ba8212f745fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.145 186962 DEBUG nova.compute.manager [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.159 186962 DEBUG nova.virt.libvirt.vif [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-1',id=90,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:11Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=2198e4d8-4116-4747-b375-ba8212f745fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.159 186962 DEBUG nova.network.os_vif_util [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "address": "fa:16:3e:10:72:0c", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87d28cdc-3e", "ovs_interfaceid": "87d28cdc-3ed5-4ff4-99a5-89e199e95f8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.160 186962 DEBUG nova.network.os_vif_util [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.160 186962 DEBUG os_vif [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.162 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87d28cdc-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:14 np0005539505 kernel: tapb239f9b7-bd (unregistering): left promiscuous mode
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.165 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.166 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:14 np0005539505 NetworkManager[55134]: <info>  [1764400154.1698] device (tapb239f9b7-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.169 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:14Z|00355|binding|INFO|Releasing lport b239f9b7-bdd7-4374-a10b-614ad18cffda from this chassis (sb_readonly=0)
Nov 29 02:09:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:14Z|00356|binding|INFO|Setting lport b239f9b7-bdd7-4374-a10b-614ad18cffda down in Southbound
Nov 29 02:09:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:14Z|00357|binding|INFO|Removing iface tapb239f9b7-bd ovn-installed in OVS
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.178 186962 INFO os_vif [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:72:0c,bridge_name='br-int',has_traffic_filtering=True,id=87d28cdc-3ed5-4ff4-99a5-89e199e95f8b,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87d28cdc-3e')#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.178 186962 INFO nova.virt.libvirt.driver [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Deleting instance files /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd_del#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.179 186962 INFO nova.virt.libvirt.driver [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Deletion of /var/lib/nova/instances/2198e4d8-4116-4747-b375-ba8212f745fd_del complete#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.182 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.183 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:43:2a 10.100.0.14'], port_security=['fa:16:3e:14:43:2a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '94c74b87-76f9-4489-a7d0-5abe91d0db7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61999b35-f067-478e-ae7d-2c014e39aec6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a16c3c4eb5654a7f9742906d1a6f6698', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33049197-f5b6-46f9-bb9f-ae10a060cbf4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0493b85c-e95a-459c-8e5e-a22ec09f96c2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b239f9b7-bdd7-4374-a10b-614ad18cffda) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.184 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b239f9b7-bdd7-4374-a10b-614ad18cffda in datapath 61999b35-f067-478e-ae7d-2c014e39aec6 unbound from our chassis#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.185 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61999b35-f067-478e-ae7d-2c014e39aec6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.186 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7adc2f52-654e-47c2-93dc-ca1e1ec345fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:14.186 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 namespace which is not needed anymore#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.230 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005b.scope: Deactivated successfully.
Nov 29 02:09:14 np0005539505 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d0000005b.scope: Consumed 3.569s CPU time.
Nov 29 02:09:14 np0005539505 systemd-machined[153285]: Machine qemu-44-instance-0000005b terminated.
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.273 186962 INFO nova.compute.manager [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.274 186962 DEBUG oslo.service.loopingcall [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.274 186962 DEBUG nova.compute.manager [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.275 186962 DEBUG nova.network.neutron [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:09:14 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [NOTICE]   (229031) : haproxy version is 2.8.14-c23fe91
Nov 29 02:09:14 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [NOTICE]   (229031) : path to executable is /usr/sbin/haproxy
Nov 29 02:09:14 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [WARNING]  (229031) : Exiting Master process...
Nov 29 02:09:14 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [ALERT]    (229031) : Current worker (229033) exited with code 143 (Terminated)
Nov 29 02:09:14 np0005539505 neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6[229027]: [WARNING]  (229031) : All workers exited. Exiting... (0)
Nov 29 02:09:14 np0005539505 systemd[1]: libpod-9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b.scope: Deactivated successfully.
Nov 29 02:09:14 np0005539505 podman[229103]: 2025-11-29 07:09:14.379278344 +0000 UTC m=+0.109944730 container died 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.409 186962 INFO nova.virt.libvirt.driver [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Instance destroyed successfully.#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.409 186962 DEBUG nova.objects.instance [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lazy-loading 'resources' on Instance uuid 94c74b87-76f9-4489-a7d0-5abe91d0db7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b-userdata-shm.mount: Deactivated successfully.
Nov 29 02:09:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay-e0e3ecf043b4516753e7a5aba439bb0499e79d8cd0a5718525e7449c0a410e88-merged.mount: Deactivated successfully.
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.422 186962 DEBUG nova.virt.libvirt.vif [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-996056207',display_name='tempest-MultipleCreateTestJSON-server-996056207-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-996056207-2',id=91,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-11-29T07:09:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a16c3c4eb5654a7f9742906d1a6f6698',ramdisk_id='',reservation_id='r-z4wsatd8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-910974113',owner_user_name='tempest-MultipleCreateTestJSON-910974113-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:11Z,user_data=None,user_id='e621c9f314214c7980a4d441f0600e90',uuid=94c74b87-76f9-4489-a7d0-5abe91d0db7b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.423 186962 DEBUG nova.network.os_vif_util [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converting VIF {"id": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "address": "fa:16:3e:14:43:2a", "network": {"id": "61999b35-f067-478e-ae7d-2c014e39aec6", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-668045473-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a16c3c4eb5654a7f9742906d1a6f6698", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb239f9b7-bd", "ovs_interfaceid": "b239f9b7-bdd7-4374-a10b-614ad18cffda", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.423 186962 DEBUG nova.network.os_vif_util [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.423 186962 DEBUG os_vif [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.424 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.425 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb239f9b7-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.465 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.469 186962 INFO os_vif [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:43:2a,bridge_name='br-int',has_traffic_filtering=True,id=b239f9b7-bdd7-4374-a10b-614ad18cffda,network=Network(61999b35-f067-478e-ae7d-2c014e39aec6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb239f9b7-bd')#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.470 186962 INFO nova.virt.libvirt.driver [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Deleting instance files /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b_del#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.470 186962 INFO nova.virt.libvirt.driver [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Deletion of /var/lib/nova/instances/94c74b87-76f9-4489-a7d0-5abe91d0db7b_del complete#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.616 186962 INFO nova.compute.manager [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.617 186962 DEBUG oslo.service.loopingcall [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.617 186962 DEBUG nova.compute.manager [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:09:14 np0005539505 nova_compute[186958]: 2025-11-29 07:09:14.617 186962 DEBUG nova.network.neutron [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:09:14 np0005539505 podman[229103]: 2025-11-29 07:09:14.68720085 +0000 UTC m=+0.417867236 container cleanup 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:09:14 np0005539505 systemd[1]: libpod-conmon-9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b.scope: Deactivated successfully.
Nov 29 02:09:15 np0005539505 podman[229151]: 2025-11-29 07:09:15.069771636 +0000 UTC m=+0.353431223 container remove 9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.075 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3068328-b66a-4a18-9b93-2ef88495641f]: (4, ('Sat Nov 29 07:09:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b)\n9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b\nSat Nov 29 07:09:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 (9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b)\n9700cfdbf3fca2c4b9a359b4f3184107370f3a63fd68b22fb5fb1a7ad421113b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.077 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31f7be75-f3b6-4a6d-9426-893bd87151b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.078 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61999b35-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.079 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:15 np0005539505 kernel: tap61999b35-f0: left promiscuous mode
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.096 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.100 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ead4a412-a89c-4f99-b809-54dea7291380]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.101 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-unplugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.101 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.101 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.101 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.102 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] No waiting events found dispatching network-vif-unplugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.102 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-unplugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.102 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.102 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.102 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.103 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.103 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] No waiting events found dispatching network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.103 186962 WARNING nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received unexpected event network-vif-plugged-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.103 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-unplugged-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.103 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.104 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.104 186962 DEBUG oslo_concurrency.lockutils [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.104 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] No waiting events found dispatching network-vif-unplugged-b239f9b7-bdd7-4374-a10b-614ad18cffda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.104 186962 DEBUG nova.compute.manager [req-1dfc4bc9-8451-4520-95c4-f1cba3c282d0 req-1927a0f3-992f-4448-8bb3-dbf35f7ebc0b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-unplugged-b239f9b7-bdd7-4374-a10b-614ad18cffda for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.115 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9cbeb2-9b9b-4875-a39a-5e0a79240b65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.117 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[05fdd5cd-cd7c-4401-9963-d69f4fa5a890]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.131 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6f7f94-6ed7-4403-96d6-250bb9fcccf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 559647, 'reachable_time': 37150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229166, 'error': None, 'target': 'ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.134 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61999b35-f067-478e-ae7d-2c014e39aec6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:09:15 np0005539505 systemd[1]: run-netns-ovnmeta\x2d61999b35\x2df067\x2d478e\x2dae7d\x2d2c014e39aec6.mount: Deactivated successfully.
Nov 29 02:09:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:15.134 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d6784f-bf44-4dff-9da2-f46d6e59ae09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.771 186962 DEBUG nova.network.neutron [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.793 186962 INFO nova.compute.manager [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Took 1.52 seconds to deallocate network for instance.#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.888 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.889 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:15 np0005539505 nova_compute[186958]: 2025-11-29 07:09:15.987 186962 DEBUG nova.compute.provider_tree [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.006 186962 DEBUG nova.scheduler.client.report [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.051 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.090 186962 INFO nova.scheduler.client.report [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Deleted allocations for instance 2198e4d8-4116-4747-b375-ba8212f745fd#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.122 186962 DEBUG nova.network.neutron [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.153 186962 INFO nova.compute.manager [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Took 1.54 seconds to deallocate network for instance.#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.191 186962 DEBUG oslo_concurrency.lockutils [None req-ce908777-76dc-4995-a0c7-8c1953faab20 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "2198e4d8-4116-4747-b375-ba8212f745fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.241 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.241 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.303 186962 DEBUG nova.compute.provider_tree [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.318 186962 DEBUG nova.scheduler.client.report [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.341 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.346 186962 DEBUG nova.compute.manager [req-8b0440f1-ef42-4ce5-af9f-e959fcaadfad req-3952c135-f94b-4246-b728-f92fbd15f33d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-deleted-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.373 186962 INFO nova.scheduler.client.report [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Deleted allocations for instance 94c74b87-76f9-4489-a7d0-5abe91d0db7b#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.717 186962 DEBUG oslo_concurrency.lockutils [None req-303571df-c07b-4b70-90b8-741a9544a770 e621c9f314214c7980a4d441f0600e90 a16c3c4eb5654a7f9742906d1a6f6698 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:16 np0005539505 nova_compute[186958]: 2025-11-29 07:09:16.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.066 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.180 186962 DEBUG nova.compute.manager [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.181 186962 DEBUG oslo_concurrency.lockutils [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.181 186962 DEBUG oslo_concurrency.lockutils [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.182 186962 DEBUG oslo_concurrency.lockutils [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "94c74b87-76f9-4489-a7d0-5abe91d0db7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.182 186962 DEBUG nova.compute.manager [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] No waiting events found dispatching network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.183 186962 WARNING nova.compute.manager [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Received unexpected event network-vif-plugged-b239f9b7-bdd7-4374-a10b-614ad18cffda for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.183 186962 DEBUG nova.compute.manager [req-c03e3deb-e9bf-4f33-99a6-ea23771f8a4d req-9f28d0fd-56dd-4a03-8c51-677109f9d53e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Received event network-vif-deleted-87d28cdc-3ed5-4ff4-99a5-89e199e95f8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:17 np0005539505 nova_compute[186958]: 2025-11-29 07:09:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:17 np0005539505 podman[229169]: 2025-11-29 07:09:17.736638384 +0000 UTC m=+0.063992901 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:09:17 np0005539505 podman[229168]: 2025-11-29 07:09:17.771124839 +0000 UTC m=+0.101619194 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Nov 29 02:09:19 np0005539505 nova_compute[186958]: 2025-11-29 07:09:19.233 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:19 np0005539505 nova_compute[186958]: 2025-11-29 07:09:19.465 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:20 np0005539505 podman[229213]: 2025-11-29 07:09:20.722296275 +0000 UTC m=+0.058729422 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:09:24 np0005539505 nova_compute[186958]: 2025-11-29 07:09:24.235 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:24 np0005539505 nova_compute[186958]: 2025-11-29 07:09:24.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:26.947 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:09:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:26.948 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:09:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:26.948 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.122 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400154.1207874, 2198e4d8-4116-4747-b375-ba8212f745fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.122 186962 INFO nova.compute.manager [-] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] VM Stopped (Lifecycle Event)
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.237 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.407 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400154.4065847, 94c74b87-76f9-4489-a7d0-5abe91d0db7b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.407 186962 INFO nova.compute.manager [-] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] VM Stopped (Lifecycle Event)
Nov 29 02:09:29 np0005539505 nova_compute[186958]: 2025-11-29 07:09:29.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:30 np0005539505 podman[229232]: 2025-11-29 07:09:30.72638601 +0000 UTC m=+0.058548496 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:09:30 np0005539505 podman[229233]: 2025-11-29 07:09:30.76812207 +0000 UTC m=+0.095526042 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 02:09:31 np0005539505 nova_compute[186958]: 2025-11-29 07:09:31.050 186962 DEBUG nova.compute.manager [None req-3c979972-f466-4b33-953d-a8be185f215d - - - - - -] [instance: 2198e4d8-4116-4747-b375-ba8212f745fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:09:31 np0005539505 nova_compute[186958]: 2025-11-29 07:09:31.106 186962 DEBUG nova.compute.manager [None req-c0af4d50-779f-4b2b-b076-ec08fb41b79d - - - - - -] [instance: 94c74b87-76f9-4489-a7d0-5abe91d0db7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.678 186962 DEBUG nova.compute.manager [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.899 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.900 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.927 186962 DEBUG nova.objects.instance [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_requests' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.945 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.945 186962 INFO nova.compute.claims [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Claim successful on node compute-2.ctlplane.example.com
Nov 29 02:09:33 np0005539505 nova_compute[186958]: 2025-11-29 07:09:33.946 186962 DEBUG nova.objects.instance [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.123 186962 DEBUG nova.objects.instance [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'pci_devices' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.238 186962 INFO nova.compute.resource_tracker [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating resource usage from migration 238f6481-057e-4ef4-9bf5-568723a7c569
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.239 186962 DEBUG nova.compute.resource_tracker [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Starting to track incoming migration 238f6481-057e-4ef4-9bf5-568723a7c569 with flavor e29df891-dca5-4a1c-9258-dc512a46956f _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.318 186962 DEBUG nova.compute.provider_tree [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.333 186962 DEBUG nova.scheduler.client.report [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.361 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.362 186962 INFO nova.compute.manager [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Migrating
Nov 29 02:09:34 np0005539505 nova_compute[186958]: 2025-11-29 07:09:34.512 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:35 np0005539505 podman[229280]: 2025-11-29 07:09:35.738245675 +0000 UTC m=+0.071240865 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:09:36 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:09:36 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:09:36 np0005539505 systemd-logind[794]: New session 46 of user nova.
Nov 29 02:09:36 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:09:36 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:09:36 np0005539505 podman[229303]: 2025-11-29 07:09:36.907357919 +0000 UTC m=+0.059762791 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 02:09:37 np0005539505 systemd[229322]: Queued start job for default target Main User Target.
Nov 29 02:09:37 np0005539505 systemd[229322]: Created slice User Application Slice.
Nov 29 02:09:37 np0005539505 systemd[229322]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:09:37 np0005539505 systemd[229322]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:09:37 np0005539505 systemd[229322]: Reached target Paths.
Nov 29 02:09:37 np0005539505 systemd[229322]: Reached target Timers.
Nov 29 02:09:37 np0005539505 systemd[229322]: Starting D-Bus User Message Bus Socket...
Nov 29 02:09:37 np0005539505 systemd[229322]: Starting Create User's Volatile Files and Directories...
Nov 29 02:09:37 np0005539505 systemd[229322]: Finished Create User's Volatile Files and Directories.
Nov 29 02:09:37 np0005539505 systemd[229322]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:09:37 np0005539505 systemd[229322]: Reached target Sockets.
Nov 29 02:09:37 np0005539505 systemd[229322]: Reached target Basic System.
Nov 29 02:09:37 np0005539505 systemd[229322]: Reached target Main User Target.
Nov 29 02:09:37 np0005539505 systemd[229322]: Startup finished in 133ms.
Nov 29 02:09:37 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:09:37 np0005539505 systemd[1]: Started Session 46 of User nova.
Nov 29 02:09:37 np0005539505 systemd[1]: session-46.scope: Deactivated successfully.
Nov 29 02:09:37 np0005539505 systemd-logind[794]: Session 46 logged out. Waiting for processes to exit.
Nov 29 02:09:37 np0005539505 systemd-logind[794]: Removed session 46.
Nov 29 02:09:37 np0005539505 systemd-logind[794]: New session 48 of user nova.
Nov 29 02:09:37 np0005539505 systemd[1]: Started Session 48 of User nova.
Nov 29 02:09:37 np0005539505 systemd[1]: session-48.scope: Deactivated successfully.
Nov 29 02:09:37 np0005539505 systemd-logind[794]: Session 48 logged out. Waiting for processes to exit.
Nov 29 02:09:37 np0005539505 systemd-logind[794]: Removed session 48.
Nov 29 02:09:39 np0005539505 nova_compute[186958]: 2025-11-29 07:09:39.240 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:39 np0005539505 nova_compute[186958]: 2025-11-29 07:09:39.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.432 186962 DEBUG nova.compute.manager [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.546 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.546 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.593 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'pci_requests' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.609 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.610 186962 INFO nova.compute.claims [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Claim successful on node compute-2.ctlplane.example.com
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.610 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'resources' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.623 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'numa_topology' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:40 np0005539505 systemd-logind[794]: New session 49 of user nova.
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.635 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'pci_devices' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:09:40 np0005539505 systemd[1]: Started Session 49 of User nova.
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.684 186962 DEBUG nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.684 186962 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.684 186962 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.685 186962 DEBUG oslo_concurrency.lockutils [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.685 186962 DEBUG nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.685 186962 WARNING nova.compute.manager [req-541e6b42-1b55-45e2-abf0-24432fc59b53 req-716a7488-3ab8-48d9-97ff-b12c5bb42502 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state resize_migrating.
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.689 186962 INFO nova.compute.resource_tracker [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating resource usage from migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.689 186962 DEBUG nova.compute.resource_tracker [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Starting to track incoming migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126 with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.793 186962 DEBUG nova.compute.provider_tree [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.811 186962 DEBUG nova.scheduler.client.report [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.836 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:09:40 np0005539505 nova_compute[186958]: 2025-11-29 07:09:40.837 186962 INFO nova.compute.manager [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Migrating
Nov 29 02:09:41 np0005539505 systemd[1]: session-49.scope: Deactivated successfully.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: Session 49 logged out. Waiting for processes to exit.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: Removed session 49.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: New session 50 of user nova.
Nov 29 02:09:41 np0005539505 systemd[1]: Started Session 50 of User nova.
Nov 29 02:09:41 np0005539505 systemd[1]: session-50.scope: Deactivated successfully.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: Session 50 logged out. Waiting for processes to exit.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: Removed session 50.
Nov 29 02:09:41 np0005539505 systemd-logind[794]: New session 51 of user nova.
Nov 29 02:09:41 np0005539505 systemd[1]: Started Session 51 of User nova.
Nov 29 02:09:42 np0005539505 systemd[1]: session-51.scope: Deactivated successfully.
Nov 29 02:09:42 np0005539505 systemd-logind[794]: Session 51 logged out. Waiting for processes to exit.
Nov 29 02:09:42 np0005539505 systemd-logind[794]: Removed session 51.
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.700 186962 INFO nova.network.neutron [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating port 18ad87ad-fee6-484b-81da-6889ed2a9af1 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}
Nov 29 02:09:42 np0005539505 systemd-logind[794]: New session 52 of user nova.
Nov 29 02:09:42 np0005539505 systemd[1]: Started Session 52 of User nova.
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.811 186962 DEBUG nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.812 186962 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.812 186962 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.813 186962 DEBUG oslo_concurrency.lockutils [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.813 186962 DEBUG nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:09:42 np0005539505 nova_compute[186958]: 2025-11-29 07:09:42.813 186962 WARNING nova.compute.manager [req-6c342841-afd2-484a-b14e-fc33a59dce25 req-2127d0f4-0d68-4584-b89f-269471fb2b79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state active and task_state resize_migrated.
Nov 29 02:09:42 np0005539505 systemd[1]: session-52.scope: Deactivated successfully.
Nov 29 02:09:42 np0005539505 systemd-logind[794]: Session 52 logged out. Waiting for processes to exit.
Nov 29 02:09:42 np0005539505 systemd-logind[794]: Removed session 52.
Nov 29 02:09:42 np0005539505 systemd-logind[794]: New session 53 of user nova.
Nov 29 02:09:43 np0005539505 systemd[1]: Started Session 53 of User nova.
Nov 29 02:09:43 np0005539505 systemd[1]: session-53.scope: Deactivated successfully.
Nov 29 02:09:43 np0005539505 systemd-logind[794]: Session 53 logged out. Waiting for processes to exit.
Nov 29 02:09:43 np0005539505 systemd-logind[794]: Removed session 53.
Nov 29 02:09:43 np0005539505 nova_compute[186958]: 2025-11-29 07:09:43.730 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:43 np0005539505 nova_compute[186958]: 2025-11-29 07:09:43.731 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:43 np0005539505 nova_compute[186958]: 2025-11-29 07:09:43.731 186962 DEBUG nova.network.neutron [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:44 np0005539505 nova_compute[186958]: 2025-11-29 07:09:44.243 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:44 np0005539505 nova_compute[186958]: 2025-11-29 07:09:44.510 186962 DEBUG nova.compute.manager [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:44 np0005539505 nova_compute[186958]: 2025-11-29 07:09:44.511 186962 DEBUG nova.compute.manager [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:44 np0005539505 nova_compute[186958]: 2025-11-29 07:09:44.511 186962 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:44 np0005539505 nova_compute[186958]: 2025-11-29 07:09:44.515 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:45 np0005539505 nova_compute[186958]: 2025-11-29 07:09:45.909 186962 DEBUG nova.network.neutron [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:45 np0005539505 nova_compute[186958]: 2025-11-29 07:09:45.970 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:45 np0005539505 nova_compute[186958]: 2025-11-29 07:09:45.974 186962 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:45 np0005539505 nova_compute[186958]: 2025-11-29 07:09:45.974 186962 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.207 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.208 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.209 186962 INFO nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Creating image(s)#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.210 186962 DEBUG nova.objects.instance [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'trusted_certs' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.222 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.283 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.284 186962 DEBUG nova.virt.disk.api [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Checking if we can resize image /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.285 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.349 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.350 186962 DEBUG nova.virt.disk.api [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Cannot resize image /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.371 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.372 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Ensure instance console log exists: /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.372 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.373 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.373 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.376 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Start _get_guest_xml network_info=[{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.381 186962 WARNING nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.391 186962 DEBUG nova.virt.libvirt.host [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.392 186962 DEBUG nova.virt.libvirt.host [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.397 186962 DEBUG nova.virt.libvirt.host [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.398 186962 DEBUG nova.virt.libvirt.host [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.399 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.399 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e29df891-dca5-4a1c-9258-dc512a46956f',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.400 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.400 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.400 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.401 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.401 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.401 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.401 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.402 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.402 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.402 186962 DEBUG nova.virt.hardware [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.402 186962 DEBUG nova.objects.instance [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'vcpu_model' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.436 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:46 np0005539505 systemd-logind[794]: New session 54 of user nova.
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.502 186962 DEBUG oslo_concurrency.processutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.503 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.503 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.504 186962 DEBUG oslo_concurrency.lockutils [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.505 186962 DEBUG nova.virt.libvirt.vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.506 186962 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.507 186962 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.509 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <uuid>23cc8968-d9b9-42dc-b458-0683a72a0194</uuid>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <name>instance-00000057</name>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <memory>196608</memory>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerActionsTestJSON-server-1325280827</nova:name>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:09:46</nova:creationTime>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.micro">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:memory>192</nova:memory>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:user uuid="e1b8fbcc8caa4d94b69570f233c56d18">tempest-ServerActionsTestJSON-157226036-project-member</nova:user>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:project uuid="6e6c366001df43fb91731faf7a9578fc">tempest-ServerActionsTestJSON-157226036</nova:project>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        <nova:port uuid="18ad87ad-fee6-484b-81da-6889ed2a9af1">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="serial">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="uuid">23cc8968-d9b9-42dc-b458-0683a72a0194</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/disk.config"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:77:06:41"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <target dev="tap18ad87ad-fe"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/console.log" append="off"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:09:46 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:09:46 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:09:46 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:09:46 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.511 186962 DEBUG nova.virt.libvirt.vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.512 186962 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-1628606092-network", "vif_mac": "fa:16:3e:77:06:41"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.513 186962 DEBUG nova.network.os_vif_util [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.513 186962 DEBUG os_vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.514 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.515 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.519 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.519 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18ad87ad-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.520 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18ad87ad-fe, col_values=(('external_ids', {'iface-id': '18ad87ad-fee6-484b-81da-6889ed2a9af1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:06:41', 'vm-uuid': '23cc8968-d9b9-42dc-b458-0683a72a0194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:46 np0005539505 systemd[1]: Started Session 54 of User nova.
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.5237] manager: (tap18ad87ad-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/181)
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.531 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.532 186962 INFO os_vif [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.695 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.696 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.696 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] No VIF found with MAC fa:16:3e:77:06:41, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.697 186962 INFO nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Using config drive#033[00m
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.7519] manager: (tap18ad87ad-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Nov 29 02:09:46 np0005539505 kernel: tap18ad87ad-fe: entered promiscuous mode
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.757 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.761 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:46Z|00358|binding|INFO|Claiming lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 for this chassis.
Nov 29 02:09:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:46Z|00359|binding|INFO|18ad87ad-fee6-484b-81da-6889ed2a9af1: Claiming fa:16:3e:77:06:41 10.100.0.10
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.765 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.769 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.7762] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.7806] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.773 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 systemd-udevd[229402]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.785 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.787 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 bound to our chassis#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.789 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9226dea3-6355-4dd9-9441-d093c1f1a399#033[00m
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.8004] device (tap18ad87ad-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:09:46 np0005539505 systemd-machined[153285]: New machine qemu-46-instance-00000057.
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.8022] device (tap18ad87ad-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.801 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ac124687-1088-40d9-b55b-2e822ec4e959]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.803 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9226dea3-61 in ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.804 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9226dea3-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.804 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c6af4102-faa6-405a-b9c4-267ad221e347]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.806 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4ae6d7-65e5-4edb-968c-316129eff24d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.819 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e470b97c-8a6d-468c-bcb0-080d73dc85ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 systemd[1]: Started Virtual Machine qemu-46-instance-00000057.
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.857 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b53ec45a-39ae-4a3f-bf65-67d0ac78d767]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.886 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4bdeb758-57e5-47f2-92a9-f1ce638519bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.9072] manager: (tap9226dea3-60): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.909 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[337f9a45-5c74-4bae-8d5c-b1f65f980d11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.936 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.953 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0b844e92-29ca-44eb-8604-bddc0dc24e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.955 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.957 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c653b0-c452-4f99-a9c6-878003311a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:46Z|00360|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 ovn-installed in OVS
Nov 29 02:09:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:46Z|00361|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 up in Southbound
Nov 29 02:09:46 np0005539505 nova_compute[186958]: 2025-11-29 07:09:46.964 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:46 np0005539505 systemd[1]: session-54.scope: Deactivated successfully.
Nov 29 02:09:46 np0005539505 systemd-logind[794]: Session 54 logged out. Waiting for processes to exit.
Nov 29 02:09:46 np0005539505 systemd-logind[794]: Removed session 54.
Nov 29 02:09:46 np0005539505 NetworkManager[55134]: <info>  [1764400186.9826] device (tap9226dea3-60): carrier: link connected
Nov 29 02:09:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:46.986 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f19a25c1-dbd5-4ee2-90af-e843c51fae99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.006 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0809465f-4f52-41da-8b84-bc366b18ec55]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563456, 'reachable_time': 35827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229442, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.026 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb511071-3740-4683-b9f3-c6b84eb95424]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:493d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563456, 'tstamp': 563456}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229445, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.046 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[136f3e9f-87ef-4060-b564-b87166adc45e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9226dea3-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:49:3d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563456, 'reachable_time': 35827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229446, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.075 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400187.0741444, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.075 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.077 186962 DEBUG nova.compute.manager [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.078 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5874fa3a-24c1-4147-90a0-f0750916cdc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.080 186962 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance running successfully.#033[00m
Nov 29 02:09:47 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.083 186962 DEBUG nova.virt.libvirt.guest [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.083 186962 DEBUG nova.virt.libvirt.driver [None req-d97d0212-9496-448d-833c-51c4737b3eac e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.104 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:47 np0005539505 systemd-logind[794]: New session 55 of user nova.
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.111 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:47 np0005539505 systemd[1]: Started Session 55 of User nova.
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.137 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc87e89-5683-48f0-a74f-105af60f1d99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.139 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.139 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.139 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9226dea3-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.141 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:47 np0005539505 NetworkManager[55134]: <info>  [1764400187.1417] manager: (tap9226dea3-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Nov 29 02:09:47 np0005539505 kernel: tap9226dea3-60: entered promiscuous mode
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.143 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.143 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9226dea3-60, col_values=(('external_ids', {'iface-id': 'e99fae54-9bf0-4a59-8b06-7a4b6ecf1479'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.144 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:47Z|00362|binding|INFO|Releasing lport e99fae54-9bf0-4a59-8b06-7a4b6ecf1479 from this chassis (sb_readonly=0)
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.157 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.158 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.159 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f7584f-2f2e-4756-803b-ae75480358ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.160 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9226dea3-6355-4dd9-9441-d093c1f1a399.pid.haproxy
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9226dea3-6355-4dd9-9441-d093c1f1a399
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:09:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:47.161 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'env', 'PROCESS_TAG=haproxy-9226dea3-6355-4dd9-9441-d093c1f1a399', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9226dea3-6355-4dd9-9441-d093c1f1a399.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.191 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.191 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400187.0750895, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.192 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Started (Lifecycle Event)#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.235 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.238 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:09:47 np0005539505 systemd[1]: session-55.scope: Deactivated successfully.
Nov 29 02:09:47 np0005539505 systemd-logind[794]: Session 55 logged out. Waiting for processes to exit.
Nov 29 02:09:47 np0005539505 systemd-logind[794]: Removed session 55.
Nov 29 02:09:47 np0005539505 systemd-logind[794]: New session 56 of user nova.
Nov 29 02:09:47 np0005539505 systemd[1]: Started Session 56 of User nova.
Nov 29 02:09:47 np0005539505 systemd[1]: session-56.scope: Deactivated successfully.
Nov 29 02:09:47 np0005539505 systemd-logind[794]: Session 56 logged out. Waiting for processes to exit.
Nov 29 02:09:47 np0005539505 systemd-logind[794]: Removed session 56.
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.508 186962 DEBUG nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.509 186962 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.510 186962 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.510 186962 DEBUG oslo_concurrency.lockutils [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.510 186962 DEBUG nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:47 np0005539505 nova_compute[186958]: 2025-11-29 07:09:47.510 186962 WARNING nova.compute.manager [req-ae87e4fc-0347-4c02-83da-b1784b152d71 req-caa94dce-c0ee-47aa-a9a0-6a0735400e2a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:09:47 np0005539505 podman[229487]: 2025-11-29 07:09:47.562795009 +0000 UTC m=+0.076880064 container create c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:09:47 np0005539505 systemd[1]: Started libpod-conmon-c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee.scope.
Nov 29 02:09:47 np0005539505 podman[229487]: 2025-11-29 07:09:47.509966806 +0000 UTC m=+0.024051861 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:09:47 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:09:47 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0258ac20fbdbad11dba247ac24326edba159e7e2e62e78bacb0e95832f15e52d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:09:47 np0005539505 podman[229487]: 2025-11-29 07:09:47.80119608 +0000 UTC m=+0.315281135 container init c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:09:47 np0005539505 podman[229487]: 2025-11-29 07:09:47.807953701 +0000 UTC m=+0.322038736 container start c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:09:47 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [NOTICE]   (229508) : New worker (229510) forked
Nov 29 02:09:47 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [NOTICE]   (229508) : Loading success.
Nov 29 02:09:48 np0005539505 podman[229520]: 2025-11-29 07:09:48.723402694 +0000 UTC m=+0.054452561 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:09:48 np0005539505 podman[229519]: 2025-11-29 07:09:48.737238495 +0000 UTC m=+0.070546516 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.033 186962 INFO nova.network.neutron [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating port 95792ac7-cbc8-4bad-903e-600bb3d09fce with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.245 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.594 186962 DEBUG nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.595 186962 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.595 186962 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.595 186962 DEBUG oslo_concurrency.lockutils [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.595 186962 DEBUG nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.596 186962 WARNING nova.compute.manager [req-8939c2c2-3fad-44be-a352-d53eeefb843d req-a74c8ecf-2f1a-4e10-932e-24bb06b8a411 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.668 186962 DEBUG nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.669 186962 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.669 186962 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.670 186962 DEBUG oslo_concurrency.lockutils [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.670 186962 DEBUG nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:49 np0005539505 nova_compute[186958]: 2025-11-29 07:09:49.670 186962 WARNING nova.compute.manager [req-53fc399e-5fa9-4403-91b3-8a0882fb50af req-4da3c383-0bf7-453e-b376-8da58f2fe148 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:09:50 np0005539505 nova_compute[186958]: 2025-11-29 07:09:50.513 186962 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updated VIF entry in instance network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:09:50 np0005539505 nova_compute[186958]: 2025-11-29 07:09:50.513 186962 DEBUG nova.network.neutron [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:50 np0005539505 nova_compute[186958]: 2025-11-29 07:09:50.725 186962 DEBUG oslo_concurrency.lockutils [req-8a4719bb-cbc7-4005-b105-638c5cc309a1 req-c831fb43-930b-44f6-a2ee-62d1498088b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:51 np0005539505 nova_compute[186958]: 2025-11-29 07:09:51.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:51 np0005539505 nova_compute[186958]: 2025-11-29 07:09:51.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:51 np0005539505 podman[229562]: 2025-11-29 07:09:51.73012562 +0000 UTC m=+0.055061117 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.647 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:52.646 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:52.648 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.739 186962 DEBUG nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.739 186962 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.740 186962 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.740 186962 DEBUG oslo_concurrency.lockutils [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.740 186962 DEBUG nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:52 np0005539505 nova_compute[186958]: 2025-11-29 07:09:52.741 186962 WARNING nova.compute.manager [req-59060158-a93a-47ac-8509-85aaf6077025 req-71a5f9f7-9566-49d7-8101-6cd3dc4dc493 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:09:54 np0005539505 nova_compute[186958]: 2025-11-29 07:09:54.279 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:54.651 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:56 np0005539505 nova_compute[186958]: 2025-11-29 07:09:56.525 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:56 np0005539505 nova_compute[186958]: 2025-11-29 07:09:56.865 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.072 186962 DEBUG nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.073 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.073 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.073 186962 DEBUG nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.251 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.252 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.252 186962 DEBUG nova.network.neutron [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.546 186962 DEBUG nova.compute.manager [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.546 186962 DEBUG nova.compute.manager [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing instance network info cache due to event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:09:57 np0005539505 nova_compute[186958]: 2025-11-29 07:09:57.547 186962 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:09:57 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:09:57 np0005539505 systemd[229322]: Activating special unit Exit the Session...
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped target Main User Target.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped target Basic System.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped target Paths.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped target Sockets.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped target Timers.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:09:57 np0005539505 systemd[229322]: Closed D-Bus User Message Bus Socket.
Nov 29 02:09:57 np0005539505 systemd[229322]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:09:57 np0005539505 systemd[229322]: Removed slice User Application Slice.
Nov 29 02:09:57 np0005539505 systemd[229322]: Reached target Shutdown.
Nov 29 02:09:57 np0005539505 systemd[229322]: Finished Exit the Session.
Nov 29 02:09:57 np0005539505 systemd[229322]: Reached target Exit the Session.
Nov 29 02:09:57 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:09:57 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:09:57 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:09:57 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:09:57 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:09:57 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:09:57 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.281 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.430 186962 DEBUG nova.network.neutron [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.450 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.467 186962 DEBUG nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Creating tmpfile /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194/tmpevoxvd6n to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.479 186962 DEBUG nova.network.neutron [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.496 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:09:59 np0005539505 kernel: tap18ad87ad-fe (unregistering): left promiscuous mode
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.499 186962 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.500 186962 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:09:59 np0005539505 NetworkManager[55134]: <info>  [1764400199.5018] device (tap18ad87ad-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:09:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:59Z|00363|binding|INFO|Releasing lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 from this chassis (sb_readonly=0)
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.511 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:59Z|00364|binding|INFO|Setting lport 18ad87ad-fee6-484b-81da-6889ed2a9af1 down in Southbound
Nov 29 02:09:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:09:59Z|00365|binding|INFO|Removing iface tap18ad87ad-fe ovn-installed in OVS
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.514 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:59.520 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:06:41 10.100.0.10'], port_security=['fa:16:3e:77:06:41 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '23cc8968-d9b9-42dc-b458-0683a72a0194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9226dea3-6355-4dd9-9441-d093c1f1a399', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e6c366001df43fb91731faf7a9578fc', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1ab550ec-f542-4b91-8354-868da408e26d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.225', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2b9cdf8-2516-48a0-81c5-85c2327075d5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=18ad87ad-fee6-484b-81da-6889ed2a9af1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:09:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:59.521 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 18ad87ad-fee6-484b-81da-6889ed2a9af1 in datapath 9226dea3-6355-4dd9-9441-d093c1f1a399 unbound from our chassis#033[00m
Nov 29 02:09:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:59.523 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9226dea3-6355-4dd9-9441-d093c1f1a399, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:09:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:59.524 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[34f0f758-9aee-4b22-8819-3fce11822958]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:09:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:09:59.526 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 namespace which is not needed anymore#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000057.scope: Deactivated successfully.
Nov 29 02:09:59 np0005539505 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000057.scope: Consumed 11.944s CPU time.
Nov 29 02:09:59 np0005539505 systemd-machined[153285]: Machine qemu-46-instance-00000057 terminated.
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.611 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.613 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.613 186962 INFO nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Creating image(s)#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.614 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.632 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.697 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.698 186962 DEBUG nova.virt.disk.api [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Checking if we can resize image /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.699 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.737 186962 INFO nova.virt.libvirt.driver [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Instance destroyed successfully.#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.739 186962 DEBUG nova.objects.instance [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'resources' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.760 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.760 186962 DEBUG nova.virt.disk.api [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Cannot resize image /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.771 186962 DEBUG nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.771 186962 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.772 186962 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.772 186962 DEBUG oslo_concurrency.lockutils [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.772 186962 DEBUG nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.772 186962 WARNING nova.compute.manager [req-49489728-1fb9-4b7c-bbc0-5b641371be58 req-8eaf457e-330c-49df-8f5d-aa1e613c3dfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-unplugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.775 186962 DEBUG nova.virt.libvirt.vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1325280827',display_name='tempest-ServerActionsTestJSON-server-1325280827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1325280827',id=87,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCMk4ml8b3fuOLAahNu+FCDl9v+SWDrsolrsZ2S6Gt/612rFnCoMRUdrLWExAZcapVyOr30SCUoxNsWGg9cONHUEcP39CbtnDaooW7qgHPKQnqdGyUHB5iGAkkvgw7SerQ==',key_name='tempest-keypair-11305483',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:47Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e6c366001df43fb91731faf7a9578fc',ramdisk_id='',reservation_id='r-4ixfsrgy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-157226036',owner_user_name='tempest-ServerActionsTestJSON-157226036-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:09:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e1b8fbcc8caa4d94b69570f233c56d18',uuid=23cc8968-d9b9-42dc-b458-0683a72a0194,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.775 186962 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converting VIF {"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.776 186962 DEBUG nova.network.os_vif_util [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.776 186962 DEBUG os_vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.777 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.778 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18ad87ad-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.782 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.783 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Ensure instance console log exists: /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.783 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.783 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.784 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.786 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start _get_guest_xml network_info=[{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.787 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.790 186962 INFO os_vif [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:77:06:41,bridge_name='br-int',has_traffic_filtering=True,id=18ad87ad-fee6-484b-81da-6889ed2a9af1,network=Network(9226dea3-6355-4dd9-9441-d093c1f1a399),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18ad87ad-fe')#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.791 186962 INFO nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Deleting instance files /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_del#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.796 186962 INFO nova.virt.libvirt.driver [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Deletion of /var/lib/nova/instances/23cc8968-d9b9-42dc-b458-0683a72a0194_del complete#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.802 186962 WARNING nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.807 186962 DEBUG nova.virt.libvirt.host [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.807 186962 DEBUG nova.virt.libvirt.host [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.812 186962 DEBUG nova.virt.libvirt.host [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.813 186962 DEBUG nova.virt.libvirt.host [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.814 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.814 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.815 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.815 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.815 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.815 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.816 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.816 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.816 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.816 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.817 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.817 186962 DEBUG nova.virt.hardware [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.817 186962 DEBUG nova.objects.instance [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.845 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.901 186962 DEBUG oslo_concurrency.processutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.902 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.902 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.903 186962 DEBUG oslo_concurrency.lockutils [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.904 186962 DEBUG nova.virt.libvirt.vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:48Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.905 186962 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.905 186962 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.908 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <uuid>6d4e9a0c-c91c-45a4-911d-7526b420a8a9</uuid>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <name>instance-00000059</name>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1405928271</nova:name>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:09:59</nova:creationTime>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        <nova:port uuid="95792ac7-cbc8-4bad-903e-600bb3d09fce">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="serial">6d4e9a0c-c91c-45a4-911d-7526b420a8a9</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="uuid">6d4e9a0c-c91c-45a4-911d-7526b420a8a9</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk.config"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a1:a1:8f"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <target dev="tap95792ac7-cb"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/console.log" append="off"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:09:59 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:09:59 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:09:59 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:09:59 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.909 186962 DEBUG nova.virt.libvirt.vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:09:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:09:48Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.909 186962 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1326373200", "vif_mac": "fa:16:3e:a1:a1:8f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.910 186962 DEBUG nova.network.os_vif_util [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.910 186962 DEBUG os_vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.911 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.911 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.911 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.913 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap95792ac7-cb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.914 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap95792ac7-cb, col_values=(('external_ids', {'iface-id': '95792ac7-cbc8-4bad-903e-600bb3d09fce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a1:a1:8f', 'vm-uuid': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 NetworkManager[55134]: <info>  [1764400199.9160] manager: (tap95792ac7-cb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.917 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.922 186962 INFO os_vif [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb')#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.925 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.925 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:09:59 np0005539505 nova_compute[186958]: 2025-11-29 07:09:59.942 186962 DEBUG nova.objects.instance [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lazy-loading 'migration_context' on Instance uuid 23cc8968-d9b9-42dc-b458-0683a72a0194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.022 186962 DEBUG nova.compute.provider_tree [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.038 186962 DEBUG nova.scheduler.client.report [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.095 186962 DEBUG oslo_concurrency.lockutils [None req-614438f5-9e76-48ee-bbe9-f78c7f725ea7 e1b8fbcc8caa4d94b69570f233c56d18 6e6c366001df43fb91731faf7a9578fc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:00 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [NOTICE]   (229508) : haproxy version is 2.8.14-c23fe91
Nov 29 02:10:00 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [NOTICE]   (229508) : path to executable is /usr/sbin/haproxy
Nov 29 02:10:00 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [WARNING]  (229508) : Exiting Master process...
Nov 29 02:10:00 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [ALERT]    (229508) : Current worker (229510) exited with code 143 (Terminated)
Nov 29 02:10:00 np0005539505 neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399[229503]: [WARNING]  (229508) : All workers exited. Exiting... (0)
Nov 29 02:10:00 np0005539505 systemd[1]: libpod-c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee.scope: Deactivated successfully.
Nov 29 02:10:00 np0005539505 podman[229623]: 2025-11-29 07:10:00.204383055 +0000 UTC m=+0.594431807 container died c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:10:00 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee-userdata-shm.mount: Deactivated successfully.
Nov 29 02:10:00 np0005539505 systemd[1]: var-lib-containers-storage-overlay-0258ac20fbdbad11dba247ac24326edba159e7e2e62e78bacb0e95832f15e52d-merged.mount: Deactivated successfully.
Nov 29 02:10:00 np0005539505 podman[229623]: 2025-11-29 07:10:00.260574144 +0000 UTC m=+0.650622906 container cleanup c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:10:00 np0005539505 systemd[1]: libpod-conmon-c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee.scope: Deactivated successfully.
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.436 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.437 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.437 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No VIF found with MAC fa:16:3e:a1:a1:8f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.438 186962 INFO nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Using config drive#033[00m
Nov 29 02:10:00 np0005539505 kernel: tap95792ac7-cb: entered promiscuous mode
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.507 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:00Z|00366|binding|INFO|Claiming lport 95792ac7-cbc8-4bad-903e-600bb3d09fce for this chassis.
Nov 29 02:10:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:00Z|00367|binding|INFO|95792ac7-cbc8-4bad-903e-600bb3d09fce: Claiming fa:16:3e:a1:a1:8f 10.100.0.8
Nov 29 02:10:00 np0005539505 systemd-udevd[229603]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.5158] manager: (tap95792ac7-cb): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.521 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:a1:8f 10.100.0.8'], port_security=['fa:16:3e:a1:a1:8f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': '376a466b-335f-4204-8812-ec229fd4d3b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.208'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2abd3f5a-1a92-4bfd-a631-54a420dbc598, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=95792ac7-cbc8-4bad-903e-600bb3d09fce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:00Z|00368|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce ovn-installed in OVS
Nov 29 02:10:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:00Z|00369|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce up in Southbound
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.5319] device (tap95792ac7-cb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.5326] device (tap95792ac7-cb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:10:00 np0005539505 podman[229682]: 2025-11-29 07:10:00.542493963 +0000 UTC m=+0.260555746 container remove c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.548 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c7314ec0-040e-40ba-91c7-ac4a16a2b314]: (4, ('Sat Nov 29 07:09:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee)\nc9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee\nSat Nov 29 07:10:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 (c9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee)\nc9a2e76da36011e8cc9898db4278ca7bd0ead1d657cc607eaa816f35003b3eee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.549 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76c95e84-f0b7-4daf-ab63-73eae13c5bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.550 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9226dea3-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.552 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 kernel: tap9226dea3-60: left promiscuous mode
Nov 29 02:10:00 np0005539505 systemd-machined[153285]: New machine qemu-47-instance-00000059.
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.564 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.567 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fc265ca4-0456-4008-9796-469253085daa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 systemd[1]: Started Virtual Machine qemu-47-instance-00000059.
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.580 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1bbf49-842a-4603-b58d-52f004affdda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.581 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9f6593-35ae-456b-b397-3b81886ea4ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.595 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[875bf28c-0b33-4867-84c9-a99952f68efc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563446, 'reachable_time': 25882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229712, 'error': None, 'target': 'ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9226dea3\x2d6355\x2d4dd9\x2d9441\x2dd093c1f1a399.mount: Deactivated successfully.
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.598 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9226dea3-6355-4dd9-9441-d093c1f1a399 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.598 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[06636a2e-9893-4b3e-ac9a-50cb79471285]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.599 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 95792ac7-cbc8-4bad-903e-600bb3d09fce in datapath af9d1967-d1a9-4382-82b7-d9db26a40cb7 unbound from our chassis#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.600 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af9d1967-d1a9-4382-82b7-d9db26a40cb7#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.612 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d26fa4f-76a4-4e59-8093-98dccead1c23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.613 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf9d1967-d1 in ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.615 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf9d1967-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.615 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[abad33da-2cb1-4b88-ab9e-3f510914c249]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.616 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fb389dab-b87d-4728-8e6e-ff590a676048]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.628 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[6667be47-66b4-429c-a820-9b66817a5e83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.641 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a52c48-ced9-4d40-9a49-9cf9b3e11acf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.674 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[218cebac-93be-4dc2-aba4-a969a137409e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.6816] manager: (tapaf9d1967-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/189)
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.680 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[70d798fc-dcd5-4670-9612-630bcf935c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.710 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4d37ec53-5dcf-4fa9-bc3d-d0f8ee8941d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.712 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1da86bb7-d82e-4482-bdd5-f77842cbd028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.7358] device (tapaf9d1967-d0): carrier: link connected
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.741 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2023cfb0-bf59-450f-942d-44e17d1eafdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.758 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[328a7734-79f0-4295-8b1b-4090486db76d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf9d1967-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:78:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564832, 'reachable_time': 31822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229749, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.772 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[affd0e44-c4ca-4efb-9b3e-aa1b704881d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:78da'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564832, 'tstamp': 564832}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229750, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.788 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[123ccb20-1d97-4a8f-a095-8a1875333d54]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf9d1967-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:78:da'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564832, 'reachable_time': 31822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229752, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.811 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400200.8111823, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.812 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.814 186962 DEBUG nova.compute.manager [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.814 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b21da931-7722-4f09-84e0-1a334900a5e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.817 186962 INFO nova.virt.libvirt.driver [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance running successfully.#033[00m
Nov 29 02:10:00 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.818 186962 DEBUG nova.virt.libvirt.guest [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.819 186962 DEBUG nova.virt.libvirt.driver [None req-f39ee5cc-ae56-4adf-bd31-b8c136126a0f f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.831 186962 DEBUG nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.831 186962 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.832 186962 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.832 186962 DEBUG oslo_concurrency.lockutils [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.832 186962 DEBUG nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.833 186962 WARNING nova.compute.manager [req-6cc39d73-b29c-4086-83bd-31279b3d4699 req-f98b323e-c5d1-4813-814d-40ed4d6bb1ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state active and task_state resize_finish.#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.849 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.857 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.868 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[09b04a98-1e8b-4ed4-8903-6634f678c481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.870 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf9d1967-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.870 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.870 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf9d1967-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.872 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 kernel: tapaf9d1967-d0: entered promiscuous mode
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 NetworkManager[55134]: <info>  [1764400200.8741] manager: (tapaf9d1967-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.877 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf9d1967-d0, col_values=(('external_ids', {'iface-id': '0801dae7-0304-45c2-9288-7005217fa4a8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:00Z|00370|binding|INFO|Releasing lport 0801dae7-0304-45c2-9288-7005217fa4a8 from this chassis (sb_readonly=0)
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.879 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.880 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[89036952-d581-4032-9809-2d882d3f9ffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.880 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-af9d1967-d1a9-4382-82b7-d9db26a40cb7
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/af9d1967-d1a9-4382-82b7-d9db26a40cb7.pid.haproxy
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID af9d1967-d1a9-4382-82b7-d9db26a40cb7
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:10:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:00.881 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'env', 'PROCESS_TAG=haproxy-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af9d1967-d1a9-4382-82b7-d9db26a40cb7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.892 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.893 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400200.8136141, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.893 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Started (Lifecycle Event)#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.926 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.928 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.978 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:00 np0005539505 nova_compute[186958]: 2025-11-29 07:10:00.978 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.003 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.103 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.104 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.111 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.112 186962 INFO nova.compute.claims [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.277 186962 DEBUG nova.compute.provider_tree [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.290 186962 DEBUG nova.scheduler.client.report [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.310 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.311 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:10:01 np0005539505 podman[229784]: 2025-11-29 07:10:01.223387734 +0000 UTC m=+0.024302048 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.363 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.364 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.386 186962 INFO nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:10:01 np0005539505 podman[229784]: 2025-11-29 07:10:01.403137766 +0000 UTC m=+0.204052060 container create c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.408 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:10:01 np0005539505 systemd[1]: Started libpod-conmon-c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3.scope.
Nov 29 02:10:01 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:10:01 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/354373ee511799f7c68bcfd63702800d23fcc0d059e93983f7befaa4fb9946f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:10:01 np0005539505 podman[229784]: 2025-11-29 07:10:01.516696706 +0000 UTC m=+0.317611030 container init c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:10:01 np0005539505 podman[229784]: 2025-11-29 07:10:01.523266412 +0000 UTC m=+0.324180706 container start c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 02:10:01 np0005539505 podman[229801]: 2025-11-29 07:10:01.528081128 +0000 UTC m=+0.084141480 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 02:10:01 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [NOTICE]   (229853) : New worker (229855) forked
Nov 29 02:10:01 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [NOTICE]   (229853) : Loading success.
Nov 29 02:10:01 np0005539505 podman[229798]: 2025-11-29 07:10:01.546852689 +0000 UTC m=+0.107058088 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.617 186962 DEBUG nova.policy [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '53dde915e52e45a3a8ca44845484339a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f328764266904de48ec1d6484635553c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.657 186962 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updated VIF entry in instance network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.658 186962 DEBUG nova.network.neutron [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:01 np0005539505 nova_compute[186958]: 2025-11-29 07:10:01.859 186962 DEBUG oslo_concurrency.lockutils [req-ec5561d3-ca08-4e8e-a88e-5300649d44a9 req-01148284-35dd-48c3-9f45-5cb68f24a53b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.369 186962 DEBUG nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.369 186962 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.370 186962 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.370 186962 DEBUG oslo_concurrency.lockutils [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.370 186962 DEBUG nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.370 186962 WARNING nova.compute.manager [req-8cd77e3c-3f5e-451b-891b-a5d2a65d71dc req-e58c370f-8d46-42bb-91ac-b82405f2fb79 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.494 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.494 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.495 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.495 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.557 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.558 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.558 186962 INFO nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Creating image(s)#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.559 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.559 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.560 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.576 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.625 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.648 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.649 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.649 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.660 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.700 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.702 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.719 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.720 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.740 186962 DEBUG nova.compute.manager [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.741 186962 DEBUG nova.compute.manager [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing instance network info cache due to event network-changed-18ad87ad-fee6-484b-81da-6889ed2a9af1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.741 186962 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.742 186962 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.742 186962 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Refreshing network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.758 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.902 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.903 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5584MB free_disk=73.19717407226562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.904 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:02 np0005539505 nova_compute[186958]: 2025-11-29 07:10:02.904 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.029 186962 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.030 186962 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.030 186962 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.031 186962 DEBUG oslo_concurrency.lockutils [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.031 186962 DEBUG nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.031 186962 WARNING nova.compute.manager [req-2ca69e98-9b42-4917-85d8-b3bef9ebf4c1 req-36f16fb5-6033-4d27-8117-2916afb3c022 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.244 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Applying migration context for instance 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 as it has an incoming, in-progress migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126. Migration status is confirming _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.246 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating resource usage from migration 26ffe11d-ec78-48cf-95c2-9fbc8ddb9126#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.271 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.271 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 21490f0b-6f12-4093-9b77-881041a7b7e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.272 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.272 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.366 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.388 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Successfully created port: 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.619 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.657 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.658 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.799 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk 1073741824" returned: 0 in 1.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.800 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.801 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.859 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.860 186962 DEBUG nova.virt.disk.api [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Checking if we can resize image /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.861 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.918 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.920 186962 DEBUG nova.virt.disk.api [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Cannot resize image /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.920 186962 DEBUG nova.objects.instance [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lazy-loading 'migration_context' on Instance uuid 21490f0b-6f12-4093-9b77-881041a7b7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.960 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.961 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Ensure instance console log exists: /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.961 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.962 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:03 np0005539505 nova_compute[186958]: 2025-11-29 07:10:03.962 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:04 np0005539505 nova_compute[186958]: 2025-11-29 07:10:04.284 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:04 np0005539505 nova_compute[186958]: 2025-11-29 07:10:04.759 186962 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updated VIF entry in instance network info cache for port 18ad87ad-fee6-484b-81da-6889ed2a9af1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:10:04 np0005539505 nova_compute[186958]: 2025-11-29 07:10:04.760 186962 DEBUG nova.network.neutron [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Updating instance_info_cache with network_info: [{"id": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "address": "fa:16:3e:77:06:41", "network": {"id": "9226dea3-6355-4dd9-9441-d093c1f1a399", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1628606092-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e6c366001df43fb91731faf7a9578fc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18ad87ad-fe", "ovs_interfaceid": "18ad87ad-fee6-484b-81da-6889ed2a9af1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:04 np0005539505 nova_compute[186958]: 2025-11-29 07:10:04.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:05 np0005539505 nova_compute[186958]: 2025-11-29 07:10:05.216 186962 DEBUG oslo_concurrency.lockutils [req-df356605-2ee1-43b6-aa52-db02b4e2e201 req-dce590cc-974a-4e87-b305-9b183688d55e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-23cc8968-d9b9-42dc-b458-0683a72a0194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:05 np0005539505 nova_compute[186958]: 2025-11-29 07:10:05.782 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Successfully updated port: 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:10:05 np0005539505 nova_compute[186958]: 2025-11-29 07:10:05.816 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:05 np0005539505 nova_compute[186958]: 2025-11-29 07:10:05.816 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquired lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:05 np0005539505 nova_compute[186958]: 2025-11-29 07:10:05.817 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.282 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.338 186962 DEBUG nova.compute.manager [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-changed-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.338 186962 DEBUG nova.compute.manager [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Refreshing instance network info cache due to event network-changed-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.339 186962 DEBUG oslo_concurrency.lockutils [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.524 186962 DEBUG nova.compute.manager [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.524 186962 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.525 186962 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.525 186962 DEBUG oslo_concurrency.lockutils [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "23cc8968-d9b9-42dc-b458-0683a72a0194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.526 186962 DEBUG nova.compute.manager [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] No waiting events found dispatching network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.526 186962 WARNING nova.compute.manager [req-b712e1e1-446a-49b9-b6ee-0dce65831ecc req-97adee3a-8fa8-4542-b5e7-cc1d186763ae 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Received unexpected event network-vif-plugged-18ad87ad-fee6-484b-81da-6889ed2a9af1 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.657 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.659 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.659 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.660 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:10:06 np0005539505 nova_compute[186958]: 2025-11-29 07:10:06.718 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:10:06 np0005539505 podman[229888]: 2025-11-29 07:10:06.730115091 +0000 UTC m=+0.057960860 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.011 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.013 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.013 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.013 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.272 186962 DEBUG nova.network.neutron [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Updating instance_info_cache with network_info: [{"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.690 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Releasing lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.691 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Instance network_info: |[{"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.691 186962 DEBUG oslo_concurrency.lockutils [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.691 186962 DEBUG nova.network.neutron [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Refreshing network info cache for port 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.694 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Start _get_guest_xml network_info=[{"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.699 186962 WARNING nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.710 186962 DEBUG nova.virt.libvirt.host [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.711 186962 DEBUG nova.virt.libvirt.host [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.715 186962 DEBUG nova.virt.libvirt.host [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.716 186962 DEBUG nova.virt.libvirt.host [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.717 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.717 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.718 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.718 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.719 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.719 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.719 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.719 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.720 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.720 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.720 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.721 186962 DEBUG nova.virt.hardware [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.724 186962 DEBUG nova.virt.libvirt.vif [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1632671651',display_name='tempest-NoVNCConsoleTestJSON-server-1632671651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1632671651',id=93,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f328764266904de48ec1d6484635553c',ramdisk_id='',reservation_id='r-vbp7ruwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-110685081',owner_user_name='tempest-NoVNCConsoleTestJSON-110685081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:10:01Z,user_data=None,user_id='53dde915e52e45a3a8ca44845484339a',uuid=21490f0b-6f12-4093-9b77-881041a7b7e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.724 186962 DEBUG nova.network.os_vif_util [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converting VIF {"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:07 np0005539505 podman[229908]: 2025-11-29 07:10:07.725201443 +0000 UTC m=+0.056406446 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.725 186962 DEBUG nova.network.os_vif_util [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:07 np0005539505 nova_compute[186958]: 2025-11-29 07:10:07.726 186962 DEBUG nova.objects.instance [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lazy-loading 'pci_devices' on Instance uuid 21490f0b-6f12-4093-9b77-881041a7b7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.172 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <uuid>21490f0b-6f12-4093-9b77-881041a7b7e1</uuid>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <name>instance-0000005d</name>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:name>tempest-NoVNCConsoleTestJSON-server-1632671651</nova:name>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:10:07</nova:creationTime>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:user uuid="53dde915e52e45a3a8ca44845484339a">tempest-NoVNCConsoleTestJSON-110685081-project-member</nova:user>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:project uuid="f328764266904de48ec1d6484635553c">tempest-NoVNCConsoleTestJSON-110685081</nova:project>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        <nova:port uuid="8968d6e6-fda0-44a5-9acd-fd9086fbcaa6">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="serial">21490f0b-6f12-4093-9b77-881041a7b7e1</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="uuid">21490f0b-6f12-4093-9b77-881041a7b7e1</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.config"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:c5:26:c3"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <target dev="tap8968d6e6-fd"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/console.log" append="off"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:10:08 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:10:08 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:10:08 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:10:08 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.180 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Preparing to wait for external event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.181 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.181 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.181 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.182 186962 DEBUG nova.virt.libvirt.vif [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1632671651',display_name='tempest-NoVNCConsoleTestJSON-server-1632671651',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1632671651',id=93,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f328764266904de48ec1d6484635553c',ramdisk_id='',reservation_id='r-vbp7ruwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-NoVNCConsoleTestJSON-110685081',owner_user_name='tempest-NoVNCConsoleTestJSON-110685081-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:10:01Z,user_data=None,user_id='53dde915e52e45a3a8ca44845484339a',uuid=21490f0b-6f12-4093-9b77-881041a7b7e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.183 186962 DEBUG nova.network.os_vif_util [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converting VIF {"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.184 186962 DEBUG nova.network.os_vif_util [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.184 186962 DEBUG os_vif [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.185 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.186 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.186 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.190 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.190 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8968d6e6-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.190 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8968d6e6-fd, col_values=(('external_ids', {'iface-id': '8968d6e6-fda0-44a5-9acd-fd9086fbcaa6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:26:c3', 'vm-uuid': '21490f0b-6f12-4093-9b77-881041a7b7e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:08 np0005539505 NetworkManager[55134]: <info>  [1764400208.1939] manager: (tap8968d6e6-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.195 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.201 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.202 186962 INFO os_vif [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd')#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.339 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.340 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.340 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] No VIF found with MAC fa:16:3e:c5:26:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:10:08 np0005539505 nova_compute[186958]: 2025-11-29 07:10:08.341 186962 INFO nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Using config drive#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.082 186962 INFO nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Creating config drive at /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.config#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.087 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpar93qg4g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.214 186962 DEBUG oslo_concurrency.processutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpar93qg4g" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:10:09 np0005539505 kernel: tap8968d6e6-fd: entered promiscuous mode
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.2673] manager: (tap8968d6e6-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Nov 29 02:10:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:09Z|00371|binding|INFO|Claiming lport 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 for this chassis.
Nov 29 02:10:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:09Z|00372|binding|INFO|8968d6e6-fda0-44a5-9acd-fd9086fbcaa6: Claiming fa:16:3e:c5:26:c3 10.100.0.3
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.277 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:26:c3 10.100.0.3'], port_security=['fa:16:3e:c5:26:c3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '21490f0b-6f12-4093-9b77-881041a7b7e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f328764266904de48ec1d6484635553c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '92b3f996-a15f-4f11-93f1-541bc2240d76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ea705d0-3404-4b52-aa05-fc46263c64a7, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.280 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 in datapath fd0ae0dd-8827-4082-9379-b72cc347da8a bound to our chassis#033[00m
Nov 29 02:10:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:09Z|00373|binding|INFO|Setting lport 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 ovn-installed in OVS
Nov 29 02:10:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:09Z|00374|binding|INFO|Setting lport 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 up in Southbound
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.287 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.285 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd0ae0dd-8827-4082-9379-b72cc347da8a#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.303 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1f5616-0d75-42c8-9a3c-ff2228b14831]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.303 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd0ae0dd-81 in ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.306 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd0ae0dd-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.306 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3bbba7b7-07a7-49f6-806b-a8124fcd1fc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.307 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[941b86b8-1be0-4a89-a73b-0c638e31e597]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 systemd-machined[153285]: New machine qemu-48-instance-0000005d.
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.324 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[81a0f72e-c848-490d-9de4-349d5035a115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 systemd[1]: Started Virtual Machine qemu-48-instance-0000005d.
Nov 29 02:10:09 np0005539505 systemd-udevd[229952]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.346 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe8be60-75ae-4f18-8669-6e690f0b586c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.3575] device (tap8968d6e6-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.3582] device (tap8968d6e6-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.375 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7fc11b-bd7c-4e88-88b2-d8966523cea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.381 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce7edd0-6e3b-4610-a4bb-985a15e64730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.3827] manager: (tapfd0ae0dd-80): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.414 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fe495633-f3dc-4877-bc18-487a0ba7056a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.417 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3e74d998-2406-4ef2-917c-58b40e3a8c35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.4429] device (tapfd0ae0dd-80): carrier: link connected
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.448 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3da4e2d0-61b4-4930-ace0-b41e99a080a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.470 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad01af6-d22e-468a-a920-a79bd3e297b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd0ae0dd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:21:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565702, 'reachable_time': 17851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229982, 'error': None, 'target': 'ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.476 186962 DEBUG nova.network.neutron [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Updated VIF entry in instance network info cache for port 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.477 186962 DEBUG nova.network.neutron [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Updating instance_info_cache with network_info: [{"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.486 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[170ff862-b67b-4a93-a654-5cb6aef56d42]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:21ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565702, 'tstamp': 565702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229983, 'error': None, 'target': 'ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.507 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa0b920-bfbf-497b-b6a3-cb74fc06b16d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd0ae0dd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:21:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565702, 'reachable_time': 17851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229984, 'error': None, 'target': 'ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad1e580-b400-435d-8e9b-be231854d5aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.607 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b087a4fd-43cb-4f44-84c4-00439d49d361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.608 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd0ae0dd-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.608 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.609 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd0ae0dd-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.610 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 kernel: tapfd0ae0dd-80: entered promiscuous mode
Nov 29 02:10:09 np0005539505 NetworkManager[55134]: <info>  [1764400209.6114] manager: (tapfd0ae0dd-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.613 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.614 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd0ae0dd-80, col_values=(('external_ids', {'iface-id': '8ade8bf6-3e4d-4b35-9d76-4ca141d9b803'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:09Z|00375|binding|INFO|Releasing lport 8ade8bf6-3e4d-4b35-9d76-4ca141d9b803 from this chassis (sb_readonly=0)
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.626 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.627 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd0ae0dd-8827-4082-9379-b72cc347da8a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd0ae0dd-8827-4082-9379-b72cc347da8a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.628 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3f998287-1b8e-49b6-ada1-84cfb867be19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.629 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-fd0ae0dd-8827-4082-9379-b72cc347da8a
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/fd0ae0dd-8827-4082-9379-b72cc347da8a.pid.haproxy
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID fd0ae0dd-8827-4082-9379-b72cc347da8a
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:10:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:09.630 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'env', 'PROCESS_TAG=haproxy-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd0ae0dd-8827-4082-9379-b72cc347da8a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.770 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400209.769682, 21490f0b-6f12-4093-9b77-881041a7b7e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:09 np0005539505 nova_compute[186958]: 2025-11-29 07:10:09.771 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] VM Started (Lifecycle Event)#033[00m
Nov 29 02:10:10 np0005539505 podman[230023]: 2025-11-29 07:10:09.956186408 +0000 UTC m=+0.019184753 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.216 186962 DEBUG oslo_concurrency.lockutils [req-d867d44c-2065-4eb2-8e1c-2ed67461cf69 req-31735ec8-1154-47ca-bed6-6525d7effa5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-21490f0b-6f12-4093-9b77-881041a7b7e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:10 np0005539505 podman[230023]: 2025-11-29 07:10:10.217962359 +0000 UTC m=+0.280960664 container create a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:10:10 np0005539505 systemd[1]: Started libpod-conmon-a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040.scope.
Nov 29 02:10:10 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:10:10 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5dea823592dc5b317e331ab9b16e1d2866df42a3ec92fe542afef64a1b443328/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:10:10 np0005539505 podman[230023]: 2025-11-29 07:10:10.33897457 +0000 UTC m=+0.401972905 container init a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:10:10 np0005539505 podman[230023]: 2025-11-29 07:10:10.344300401 +0000 UTC m=+0.407298716 container start a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 02:10:10 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [NOTICE]   (230042) : New worker (230044) forked
Nov 29 02:10:10 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [NOTICE]   (230042) : Loading success.
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.450 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.454 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400209.7698674, 21490f0b-6f12-4093-9b77-881041a7b7e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.454 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.654 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.658 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:10 np0005539505 nova_compute[186958]: 2025-11-29 07:10:10.734 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:10:11 np0005539505 nova_compute[186958]: 2025-11-29 07:10:11.907 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [{"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.730 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.731 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.732 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.733 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.733 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.740 186962 DEBUG nova.compute.manager [req-e524c84d-ff0e-4b35-9877-6f7bc0933e96 req-634e97d0-8f28-4129-96ae-e87c7220be63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.741 186962 DEBUG oslo_concurrency.lockutils [req-e524c84d-ff0e-4b35-9877-6f7bc0933e96 req-634e97d0-8f28-4129-96ae-e87c7220be63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.741 186962 DEBUG oslo_concurrency.lockutils [req-e524c84d-ff0e-4b35-9877-6f7bc0933e96 req-634e97d0-8f28-4129-96ae-e87c7220be63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.742 186962 DEBUG oslo_concurrency.lockutils [req-e524c84d-ff0e-4b35-9877-6f7bc0933e96 req-634e97d0-8f28-4129-96ae-e87c7220be63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.742 186962 DEBUG nova.compute.manager [req-e524c84d-ff0e-4b35-9877-6f7bc0933e96 req-634e97d0-8f28-4129-96ae-e87c7220be63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Processing event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.743 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.748 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400212.7479653, 21490f0b-6f12-4093-9b77-881041a7b7e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.749 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.751 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.753 186962 INFO nova.virt.libvirt.driver [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Instance spawned successfully.#033[00m
Nov 29 02:10:12 np0005539505 nova_compute[186958]: 2025-11-29 07:10:12.754 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.104 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.110 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.114 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.115 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.116 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.116 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.116 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.117 186962 DEBUG nova.virt.libvirt.driver [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:13 np0005539505 nova_compute[186958]: 2025-11-29 07:10:13.248 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:10:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:13Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a1:a1:8f 10.100.0.8
Nov 29 02:10:14 np0005539505 nova_compute[186958]: 2025-11-29 07:10:14.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:14 np0005539505 nova_compute[186958]: 2025-11-29 07:10:14.735 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400199.734764, 23cc8968-d9b9-42dc-b458-0683a72a0194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:14 np0005539505 nova_compute[186958]: 2025-11-29 07:10:14.736 186962 INFO nova.compute.manager [-] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:10:15 np0005539505 nova_compute[186958]: 2025-11-29 07:10:15.949 186962 DEBUG nova.compute.manager [None req-6b4f3015-5ca7-49ba-a4c9-982b56623853 - - - - - -] [instance: 23cc8968-d9b9-42dc-b458-0683a72a0194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:15 np0005539505 nova_compute[186958]: 2025-11-29 07:10:15.956 186962 INFO nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Took 13.40 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:10:15 np0005539505 nova_compute[186958]: 2025-11-29 07:10:15.957 186962 DEBUG nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.924 186962 DEBUG nova.compute.manager [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.925 186962 DEBUG oslo_concurrency.lockutils [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.925 186962 DEBUG oslo_concurrency.lockutils [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.925 186962 DEBUG oslo_concurrency.lockutils [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.926 186962 DEBUG nova.compute.manager [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] No waiting events found dispatching network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:17 np0005539505 nova_compute[186958]: 2025-11-29 07:10:17.926 186962 WARNING nova.compute.manager [req-28d532ee-f00a-49e9-8699-740e2fe4019e req-e58e7786-5613-4f17-88e7-4a94a5b9aa24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received unexpected event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:10:18 np0005539505 nova_compute[186958]: 2025-11-29 07:10:18.198 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:19 np0005539505 nova_compute[186958]: 2025-11-29 07:10:19.296 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:19 np0005539505 podman[230065]: 2025-11-29 07:10:19.741722135 +0000 UTC m=+0.069354470 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:10:19 np0005539505 podman[230064]: 2025-11-29 07:10:19.74894921 +0000 UTC m=+0.074389442 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, config_id=edpm, architecture=x86_64)
Nov 29 02:10:21 np0005539505 nova_compute[186958]: 2025-11-29 07:10:21.013 186962 INFO nova.compute.manager [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Took 19.94 seconds to build instance.#033[00m
Nov 29 02:10:21 np0005539505 nova_compute[186958]: 2025-11-29 07:10:21.610 186962 DEBUG oslo_concurrency.lockutils [None req-60e66f5d-4175-439f-98a8-111c0c60aa92 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:22 np0005539505 nova_compute[186958]: 2025-11-29 07:10:22.664 186962 INFO nova.compute.manager [None req-4aedbb3c-a0d8-42e8-b438-e9fda852f3a5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Get console output#033[00m
Nov 29 02:10:22 np0005539505 nova_compute[186958]: 2025-11-29 07:10:22.668 186962 DEBUG nova.compute.manager [None req-be4fc5cf-972b-4cf8-a0e7-603870acff72 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 29 02:10:22 np0005539505 podman[230106]: 2025-11-29 07:10:22.724497296 +0000 UTC m=+0.055404494 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:10:22 np0005539505 nova_compute[186958]: 2025-11-29 07:10:22.786 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:10:23 np0005539505 nova_compute[186958]: 2025-11-29 07:10:23.204 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:23 np0005539505 nova_compute[186958]: 2025-11-29 07:10:23.408 186962 DEBUG nova.compute.manager [None req-63b47cba-bc33-405e-8e4c-e689f39925d2 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Getting vnc console get_vnc_console /usr/lib/python3.9/site-packages/nova/compute/manager.py:7196#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.185 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.186 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.186 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.187 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.187 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.201 186962 INFO nova.compute.manager [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Terminating instance#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.218 186962 DEBUG nova.compute.manager [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.299 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 kernel: tap8968d6e6-fd (unregistering): left promiscuous mode
Nov 29 02:10:24 np0005539505 NetworkManager[55134]: <info>  [1764400224.3413] device (tap8968d6e6-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00376|binding|INFO|Releasing lport 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 from this chassis (sb_readonly=0)
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00377|binding|INFO|Setting lport 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 down in Southbound
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00378|binding|INFO|Removing iface tap8968d6e6-fd ovn-installed in OVS
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.360 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:26:c3 10.100.0.3'], port_security=['fa:16:3e:c5:26:c3 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '21490f0b-6f12-4093-9b77-881041a7b7e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f328764266904de48ec1d6484635553c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '92b3f996-a15f-4f11-93f1-541bc2240d76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ea705d0-3404-4b52-aa05-fc46263c64a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.361 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 in datapath fd0ae0dd-8827-4082-9379-b72cc347da8a unbound from our chassis#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.364 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd0ae0dd-8827-4082-9379-b72cc347da8a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.365 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7888697e-f6b3-493a-8f9a-7b606a538e6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.365 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.366 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a namespace which is not needed anymore#033[00m
Nov 29 02:10:24 np0005539505 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Nov 29 02:10:24 np0005539505 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000005d.scope: Consumed 11.304s CPU time.
Nov 29 02:10:24 np0005539505 systemd-machined[153285]: Machine qemu-48-instance-0000005d terminated.
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.443 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.485 186962 INFO nova.virt.libvirt.driver [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Instance destroyed successfully.#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.486 186962 DEBUG nova.objects.instance [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lazy-loading 'resources' on Instance uuid 21490f0b-6f12-4093-9b77-881041a7b7e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.517 186962 DEBUG nova.virt.libvirt.vif [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:09:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-NoVNCConsoleTestJSON-server-1632671651',display_name='tempest-NoVNCConsoleTestJSON-server-1632671651',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-novncconsoletestjson-server-1632671651',id=93,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f328764266904de48ec1d6484635553c',ramdisk_id='',reservation_id='r-vbp7ruwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-NoVNCConsoleTestJSON-110685081',owner_user_name='tempest-NoVNCConsoleTestJSON-110685081-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:10:16Z,user_data=None,user_id='53dde915e52e45a3a8ca44845484339a',uuid=21490f0b-6f12-4093-9b77-881041a7b7e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.518 186962 DEBUG nova.network.os_vif_util [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converting VIF {"id": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "address": "fa:16:3e:c5:26:c3", "network": {"id": "fd0ae0dd-8827-4082-9379-b72cc347da8a", "bridge": "br-int", "label": "tempest-NoVNCConsoleTestJSON-834438218-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f328764266904de48ec1d6484635553c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8968d6e6-fd", "ovs_interfaceid": "8968d6e6-fda0-44a5-9acd-fd9086fbcaa6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.518 186962 DEBUG nova.network.os_vif_util [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.519 186962 DEBUG os_vif [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.521 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8968d6e6-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.526 186962 INFO os_vif [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:26:c3,bridge_name='br-int',has_traffic_filtering=True,id=8968d6e6-fda0-44a5-9acd-fd9086fbcaa6,network=Network(fd0ae0dd-8827-4082-9379-b72cc347da8a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8968d6e6-fd')#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.527 186962 INFO nova.virt.libvirt.driver [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Deleting instance files /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1_del#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.527 186962 INFO nova.virt.libvirt.driver [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Deletion of /var/lib/nova/instances/21490f0b-6f12-4093-9b77-881041a7b7e1_del complete#033[00m
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [NOTICE]   (230042) : haproxy version is 2.8.14-c23fe91
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [NOTICE]   (230042) : path to executable is /usr/sbin/haproxy
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [WARNING]  (230042) : Exiting Master process...
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [WARNING]  (230042) : Exiting Master process...
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [ALERT]    (230042) : Current worker (230044) exited with code 143 (Terminated)
Nov 29 02:10:24 np0005539505 neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a[230038]: [WARNING]  (230042) : All workers exited. Exiting... (0)
Nov 29 02:10:24 np0005539505 systemd[1]: libpod-a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040.scope: Deactivated successfully.
Nov 29 02:10:24 np0005539505 podman[230166]: 2025-11-29 07:10:24.568527872 +0000 UTC m=+0.112801323 container died a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:10:24 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040-userdata-shm.mount: Deactivated successfully.
Nov 29 02:10:24 np0005539505 systemd[1]: var-lib-containers-storage-overlay-5dea823592dc5b317e331ab9b16e1d2866df42a3ec92fe542afef64a1b443328-merged.mount: Deactivated successfully.
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.601 186962 DEBUG nova.compute.manager [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-unplugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.601 186962 DEBUG oslo_concurrency.lockutils [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.602 186962 DEBUG oslo_concurrency.lockutils [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.602 186962 DEBUG oslo_concurrency.lockutils [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.602 186962 DEBUG nova.compute.manager [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] No waiting events found dispatching network-vif-unplugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.602 186962 DEBUG nova.compute.manager [req-1772b774-7cd0-4ca3-8b47-fad187bdd812 req-5b83a27d-2fa4-427e-af94-c9c26c82787b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-unplugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.617 186962 INFO nova.compute.manager [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.618 186962 DEBUG oslo.service.loopingcall [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.618 186962 DEBUG nova.compute.manager [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.619 186962 DEBUG nova.network.neutron [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:10:24 np0005539505 podman[230166]: 2025-11-29 07:10:24.676166578 +0000 UTC m=+0.220439999 container cleanup a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:24 np0005539505 systemd[1]: libpod-conmon-a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040.scope: Deactivated successfully.
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.778 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.778 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.778 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.779 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.779 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.792 186962 INFO nova.compute.manager [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Terminating instance#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.812 186962 DEBUG nova.compute.manager [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:10:24 np0005539505 kernel: tap95792ac7-cb (unregistering): left promiscuous mode
Nov 29 02:10:24 np0005539505 NetworkManager[55134]: <info>  [1764400224.8373] device (tap95792ac7-cb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00379|binding|INFO|Releasing lport 95792ac7-cbc8-4bad-903e-600bb3d09fce from this chassis (sb_readonly=0)
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00380|binding|INFO|Setting lport 95792ac7-cbc8-4bad-903e-600bb3d09fce down in Southbound
Nov 29 02:10:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:10:24Z|00381|binding|INFO|Removing iface tap95792ac7-cb ovn-installed in OVS
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 podman[230208]: 2025-11-29 07:10:24.850484917 +0000 UTC m=+0.155464625 container remove a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.852 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:a1:8f 10.100.0.8'], port_security=['fa:16:3e:a1:a1:8f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '6d4e9a0c-c91c-45a4-911d-7526b420a8a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '8', 'neutron:security_group_ids': '376a466b-335f-4204-8812-ec229fd4d3b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2abd3f5a-1a92-4bfd-a631-54a420dbc598, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=95792ac7-cbc8-4bad-903e-600bb3d09fce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.857 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[853372e0-1e9b-42a3-ab74-dda9a762c07c]: (4, ('Sat Nov 29 07:10:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a (a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040)\na463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040\nSat Nov 29 07:10:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a (a463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040)\na463d3c92a73967e106bff530ce395790b26432a07b9e33622a978e056ed3040\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.858 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb19ede-8e2f-4a06-bb83-e521b740af42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.859 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd0ae0dd-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.861 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 kernel: tapfd0ae0dd-80: left promiscuous mode
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.874 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 nova_compute[186958]: 2025-11-29 07:10:24.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.881 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83462aa8-6b43-406d-85a9-1ecd181eda92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000059.scope: Deactivated successfully.
Nov 29 02:10:24 np0005539505 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000059.scope: Consumed 12.676s CPU time.
Nov 29 02:10:24 np0005539505 systemd-machined[153285]: Machine qemu-47-instance-00000059 terminated.
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.903 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9b991c41-4c06-490d-88b3-967acc09f197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.905 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[65f4e4a6-01ae-4e2a-94dd-cd5a307fffa2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.921 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fe00003c-df46-46b2-a7e7-700ccc993768]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565695, 'reachable_time': 19346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230231, 'error': None, 'target': 'ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.924 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd0ae0dd-8827-4082-9379-b72cc347da8a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.924 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[0c74f11d-bf18-4982-9a5b-3f6ba6396d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.925 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 95792ac7-cbc8-4bad-903e-600bb3d09fce in datapath af9d1967-d1a9-4382-82b7-d9db26a40cb7 unbound from our chassis#033[00m
Nov 29 02:10:24 np0005539505 systemd[1]: run-netns-ovnmeta\x2dfd0ae0dd\x2d8827\x2d4082\x2d9379\x2db72cc347da8a.mount: Deactivated successfully.
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.926 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af9d1967-d1a9-4382-82b7-d9db26a40cb7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.927 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8cea2238-9943-4e6f-bd2b-a4a2a98f4e8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:24.928 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 namespace which is not needed anymore#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.067 186962 INFO nova.virt.libvirt.driver [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance destroyed successfully.#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.068 186962 DEBUG nova.objects.instance [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:10:25 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [NOTICE]   (229853) : haproxy version is 2.8.14-c23fe91
Nov 29 02:10:25 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [NOTICE]   (229853) : path to executable is /usr/sbin/haproxy
Nov 29 02:10:25 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [WARNING]  (229853) : Exiting Master process...
Nov 29 02:10:25 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [ALERT]    (229853) : Current worker (229855) exited with code 143 (Terminated)
Nov 29 02:10:25 np0005539505 neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7[229807]: [WARNING]  (229853) : All workers exited. Exiting... (0)
Nov 29 02:10:25 np0005539505 systemd[1]: libpod-c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3.scope: Deactivated successfully.
Nov 29 02:10:25 np0005539505 podman[230250]: 2025-11-29 07:10:25.083957154 +0000 UTC m=+0.081397882 container died c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:10:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:10:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay-354373ee511799f7c68bcfd63702800d23fcc0d059e93983f7befaa4fb9946f8-merged.mount: Deactivated successfully.
Nov 29 02:10:25 np0005539505 podman[230250]: 2025-11-29 07:10:25.159998703 +0000 UTC m=+0.157439431 container cleanup c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:10:25 np0005539505 systemd[1]: libpod-conmon-c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3.scope: Deactivated successfully.
Nov 29 02:10:25 np0005539505 podman[230296]: 2025-11-29 07:10:25.221826968 +0000 UTC m=+0.042532829 container remove c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.228 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ea299db7-1107-44f1-b9e9-54ec51abaf36]: (4, ('Sat Nov 29 07:10:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 (c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3)\nc89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3\nSat Nov 29 07:10:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 (c89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3)\nc89388038a7fbbf1027f6540e30a9bd34e88dccaf4e54b64d60bf8a2db6b63e3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.231 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e6459043-e7d2-4c6f-a7b8-bf6499e373fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.232 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf9d1967-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:25 np0005539505 kernel: tapaf9d1967-d0: left promiscuous mode
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.255 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4e60b23e-730c-468a-8fe8-890f316970c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.270 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e83706a-a1af-4beb-9614-a4212fa0653c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.272 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[df58eb38-ed9b-4134-bf3c-0abdf88f55b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.285 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c46b8480-ef51-4775-94ff-3bffc5c39d7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564825, 'reachable_time': 40282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230315, 'error': None, 'target': 'ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.287 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af9d1967-d1a9-4382-82b7-d9db26a40cb7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:10:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:25.287 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fabf9b-79be-43b3-bdb7-c4ca2448c561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.343 186962 DEBUG nova.virt.libvirt.vif [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1405928271',display_name='tempest-TestNetworkAdvancedServerOps-server-1405928271',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1405928271',id=89,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJzX+cYphgzFb/LmLSqgC4l/EgTLaDqQRgz2oIoLmiT9pJmbbaoOE/h8lTp9y4P6Lqu0yte5POR0cnSIwuT6ICbf/J95VY/pQuT7Mh/Rw0RaK2X3rgSaxQ5jqSeZ2XDRaw==',key_name='tempest-TestNetworkAdvancedServerOps-1604525815',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:10:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-q0t03bzu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:10:09Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=6d4e9a0c-c91c-45a4-911d-7526b420a8a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.344 186962 DEBUG nova.network.os_vif_util [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "address": "fa:16:3e:a1:a1:8f", "network": {"id": "af9d1967-d1a9-4382-82b7-d9db26a40cb7", "bridge": "br-int", "label": "tempest-network-smoke--1326373200", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap95792ac7-cb", "ovs_interfaceid": "95792ac7-cbc8-4bad-903e-600bb3d09fce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.345 186962 DEBUG nova.network.os_vif_util [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.345 186962 DEBUG os_vif [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.347 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.347 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap95792ac7-cb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.348 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.351 186962 INFO os_vif [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a1:a1:8f,bridge_name='br-int',has_traffic_filtering=True,id=95792ac7-cbc8-4bad-903e-600bb3d09fce,network=Network(af9d1967-d1a9-4382-82b7-d9db26a40cb7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap95792ac7-cb')#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.352 186962 INFO nova.virt.libvirt.driver [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Deleting instance files /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_del#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.357 186962 INFO nova.virt.libvirt.driver [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Deletion of /var/lib/nova/instances/6d4e9a0c-c91c-45a4-911d-7526b420a8a9_del complete#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.468 186962 INFO nova.compute.manager [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Took 0.66 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.468 186962 DEBUG oslo.service.loopingcall [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.469 186962 DEBUG nova.compute.manager [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.469 186962 DEBUG nova.network.neutron [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:10:25 np0005539505 systemd[1]: run-netns-ovnmeta\x2daf9d1967\x2dd1a9\x2d4382\x2d82b7\x2dd9db26a40cb7.mount: Deactivated successfully.
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.765 186962 DEBUG nova.network.neutron [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:25 np0005539505 nova_compute[186958]: 2025-11-29 07:10:25.788 186962 INFO nova.compute.manager [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Took 1.17 seconds to deallocate network for instance.#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.081 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.081 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.161 186962 DEBUG nova.compute.provider_tree [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.179 186962 DEBUG nova.scheduler.client.report [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.210 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.242 186962 INFO nova.scheduler.client.report [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Deleted allocations for instance 21490f0b-6f12-4093-9b77-881041a7b7e1#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.305 186962 DEBUG nova.network.neutron [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.322 186962 INFO nova.compute.manager [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Took 0.85 seconds to deallocate network for instance.#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.325 186962 DEBUG oslo_concurrency.lockutils [None req-1e766fdd-11a9-44bd-84d2-cc57c1e3287d 53dde915e52e45a3a8ca44845484339a f328764266904de48ec1d6484635553c - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.436 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.436 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.486 186962 DEBUG nova.compute.provider_tree [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.504 186962 DEBUG nova.scheduler.client.report [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.537 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.573 186962 INFO nova.scheduler.client.report [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 6d4e9a0c-c91c-45a4-911d-7526b420a8a9#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.681 186962 DEBUG nova.compute.manager [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.681 186962 DEBUG nova.compute.manager [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing instance network info cache due to event network-changed-95792ac7-cbc8-4bad-903e-600bb3d09fce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.681 186962 DEBUG oslo_concurrency.lockutils [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.682 186962 DEBUG oslo_concurrency.lockutils [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.682 186962 DEBUG nova.network.neutron [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Refreshing network info cache for port 95792ac7-cbc8-4bad-903e-600bb3d09fce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.684 186962 DEBUG oslo_concurrency.lockutils [None req-8176c73a-df84-4b8c-baed-31a82e06ae93 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.783 186962 DEBUG nova.compute.manager [req-08c15b13-c3d6-4f9c-81da-3cbe0c1a5d5b req-f8bb7341-1ab3-4f91-92f3-05958f948e8c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-deleted-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.882 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.882 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.882 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.883 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "21490f0b-6f12-4093-9b77-881041a7b7e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.883 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] No waiting events found dispatching network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.883 186962 WARNING nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received unexpected event network-vif-plugged-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.883 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 WARNING nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-unplugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.884 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.885 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.885 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.885 186962 DEBUG oslo_concurrency.lockutils [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "6d4e9a0c-c91c-45a4-911d-7526b420a8a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.885 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] No waiting events found dispatching network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.885 186962 WARNING nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Received unexpected event network-vif-plugged-95792ac7-cbc8-4bad-903e-600bb3d09fce for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.886 186962 DEBUG nova.compute.manager [req-9a563e50-8a3d-47eb-a3d1-cfb87c231337 req-6a0a5dfa-df01-43cf-b561-0c93d893f0f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Received event network-vif-deleted-8968d6e6-fda0-44a5-9acd-fd9086fbcaa6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:10:26 np0005539505 nova_compute[186958]: 2025-11-29 07:10:26.940 186962 DEBUG nova.network.neutron [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:10:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:26.948 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:10:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:26.949 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:10:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:10:26.949 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:10:27 np0005539505 nova_compute[186958]: 2025-11-29 07:10:27.375 186962 DEBUG nova.network.neutron [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 29 02:10:27 np0005539505 nova_compute[186958]: 2025-11-29 07:10:27.376 186962 DEBUG oslo_concurrency.lockutils [req-7a5d33d5-f8a0-4b62-9ec4-25de8882447f req-c868049f-814e-4226-85bb-2ab1154f71a1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-6d4e9a0c-c91c-45a4-911d-7526b420a8a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:10:29 np0005539505 nova_compute[186958]: 2025-11-29 07:10:29.300 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:30 np0005539505 nova_compute[186958]: 2025-11-29 07:10:30.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:30 np0005539505 nova_compute[186958]: 2025-11-29 07:10:30.900 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:31 np0005539505 nova_compute[186958]: 2025-11-29 07:10:31.083 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:31 np0005539505 podman[230317]: 2025-11-29 07:10:31.744421441 +0000 UTC m=+0.067163327 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:10:31 np0005539505 podman[230318]: 2025-11-29 07:10:31.765721126 +0000 UTC m=+0.090516060 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:10:34 np0005539505 nova_compute[186958]: 2025-11-29 07:10:34.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:35 np0005539505 nova_compute[186958]: 2025-11-29 07:10:35.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:37 np0005539505 podman[230368]: 2025-11-29 07:10:37.735252601 +0000 UTC m=+0.069716260 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:10:37 np0005539505 podman[230388]: 2025-11-29 07:10:37.822191458 +0000 UTC m=+0.056497974 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:10:39 np0005539505 nova_compute[186958]: 2025-11-29 07:10:39.305 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:39 np0005539505 nova_compute[186958]: 2025-11-29 07:10:39.483 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400224.4825332, 21490f0b-6f12-4093-9b77-881041a7b7e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:39 np0005539505 nova_compute[186958]: 2025-11-29 07:10:39.484 186962 INFO nova.compute.manager [-] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:10:39 np0005539505 nova_compute[186958]: 2025-11-29 07:10:39.532 186962 DEBUG nova.compute.manager [None req-74bba94b-ee9d-45bd-addd-baad78bfd7ca - - - - - -] [instance: 21490f0b-6f12-4093-9b77-881041a7b7e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:40 np0005539505 nova_compute[186958]: 2025-11-29 07:10:40.066 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400225.0658882, 6d4e9a0c-c91c-45a4-911d-7526b420a8a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:10:40 np0005539505 nova_compute[186958]: 2025-11-29 07:10:40.067 186962 INFO nova.compute.manager [-] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:10:40 np0005539505 nova_compute[186958]: 2025-11-29 07:10:40.088 186962 DEBUG nova.compute.manager [None req-4011a601-6219-41d1-af01-3e358dd626d5 - - - - - -] [instance: 6d4e9a0c-c91c-45a4-911d-7526b420a8a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:10:40 np0005539505 nova_compute[186958]: 2025-11-29 07:10:40.352 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:44 np0005539505 nova_compute[186958]: 2025-11-29 07:10:44.355 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:45 np0005539505 nova_compute[186958]: 2025-11-29 07:10:45.354 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:10:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:10:49 np0005539505 nova_compute[186958]: 2025-11-29 07:10:49.357 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:50 np0005539505 nova_compute[186958]: 2025-11-29 07:10:50.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:50 np0005539505 podman[230410]: 2025-11-29 07:10:50.715754771 +0000 UTC m=+0.050744691 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal)
Nov 29 02:10:50 np0005539505 podman[230411]: 2025-11-29 07:10:50.716633876 +0000 UTC m=+0.045555164 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:10:53 np0005539505 nova_compute[186958]: 2025-11-29 07:10:53.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:53 np0005539505 podman[230455]: 2025-11-29 07:10:53.730068028 +0000 UTC m=+0.044220465 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Nov 29 02:10:54 np0005539505 nova_compute[186958]: 2025-11-29 07:10:54.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:55 np0005539505 nova_compute[186958]: 2025-11-29 07:10:55.357 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:58 np0005539505 nova_compute[186958]: 2025-11-29 07:10:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:10:59 np0005539505 nova_compute[186958]: 2025-11-29 07:10:59.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:10:59 np0005539505 nova_compute[186958]: 2025-11-29 07:10:59.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:00 np0005539505 nova_compute[186958]: 2025-11-29 07:11:00.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:02 np0005539505 podman[230474]: 2025-11-29 07:11:02.730981952 +0000 UTC m=+0.063014829 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:11:02 np0005539505 podman[230475]: 2025-11-29 07:11:02.764587656 +0000 UTC m=+0.078326584 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.463 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.463 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.464 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.464 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.620 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.621 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.22583770751953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.621 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.622 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.715 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.715 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.754 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.770 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.821 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:11:04 np0005539505 nova_compute[186958]: 2025-11-29 07:11:04.821 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:11:05 np0005539505 nova_compute[186958]: 2025-11-29 07:11:05.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:05.742 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:11:05 np0005539505 nova_compute[186958]: 2025-11-29 07:11:05.743 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:05.744 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:11:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:05.746 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:11:06 np0005539505 nova_compute[186958]: 2025-11-29 07:11:06.822 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:06 np0005539505 nova_compute[186958]: 2025-11-29 07:11:06.822 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:11:06 np0005539505 nova_compute[186958]: 2025-11-29 07:11:06.822 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:11:07 np0005539505 nova_compute[186958]: 2025-11-29 07:11:07.007 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:11:07 np0005539505 nova_compute[186958]: 2025-11-29 07:11:07.558 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:08 np0005539505 nova_compute[186958]: 2025-11-29 07:11:08.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:08 np0005539505 nova_compute[186958]: 2025-11-29 07:11:08.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:08 np0005539505 nova_compute[186958]: 2025-11-29 07:11:08.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:11:08 np0005539505 podman[230525]: 2025-11-29 07:11:08.730977962 +0000 UTC m=+0.061809945 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:11:08 np0005539505 podman[230526]: 2025-11-29 07:11:08.757112974 +0000 UTC m=+0.084961523 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:11:09 np0005539505 nova_compute[186958]: 2025-11-29 07:11:09.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:10 np0005539505 nova_compute[186958]: 2025-11-29 07:11:10.364 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:14 np0005539505 nova_compute[186958]: 2025-11-29 07:11:14.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:15 np0005539505 nova_compute[186958]: 2025-11-29 07:11:15.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:11:16Z|00382|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Nov 29 02:11:18 np0005539505 nova_compute[186958]: 2025-11-29 07:11:18.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:19 np0005539505 nova_compute[186958]: 2025-11-29 07:11:19.369 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:20 np0005539505 nova_compute[186958]: 2025-11-29 07:11:20.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:20 np0005539505 nova_compute[186958]: 2025-11-29 07:11:20.957 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 0.80 sec#033[00m
Nov 29 02:11:21 np0005539505 podman[230564]: 2025-11-29 07:11:21.708234962 +0000 UTC m=+0.040752908 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:11:21 np0005539505 podman[230563]: 2025-11-29 07:11:21.713615695 +0000 UTC m=+0.050283939 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Nov 29 02:11:24 np0005539505 nova_compute[186958]: 2025-11-29 07:11:24.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:24 np0005539505 podman[230609]: 2025-11-29 07:11:24.71770839 +0000 UTC m=+0.044667989 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:11:25 np0005539505 nova_compute[186958]: 2025-11-29 07:11:25.370 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:26.950 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:11:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:26.950 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:11:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:11:26.950 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:11:29 np0005539505 nova_compute[186958]: 2025-11-29 07:11:29.374 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:30 np0005539505 nova_compute[186958]: 2025-11-29 07:11:30.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:33 np0005539505 podman[230628]: 2025-11-29 07:11:33.717657117 +0000 UTC m=+0.049397992 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:11:33 np0005539505 podman[230629]: 2025-11-29 07:11:33.76354964 +0000 UTC m=+0.091246321 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 02:11:34 np0005539505 nova_compute[186958]: 2025-11-29 07:11:34.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:35 np0005539505 nova_compute[186958]: 2025-11-29 07:11:35.377 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:39 np0005539505 nova_compute[186958]: 2025-11-29 07:11:39.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:39 np0005539505 podman[230674]: 2025-11-29 07:11:39.734014273 +0000 UTC m=+0.060266171 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:11:39 np0005539505 podman[230675]: 2025-11-29 07:11:39.749988716 +0000 UTC m=+0.081054921 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:11:40 np0005539505 nova_compute[186958]: 2025-11-29 07:11:40.380 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:44 np0005539505 nova_compute[186958]: 2025-11-29 07:11:44.427 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:45 np0005539505 nova_compute[186958]: 2025-11-29 07:11:45.384 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:49 np0005539505 nova_compute[186958]: 2025-11-29 07:11:49.428 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:50 np0005539505 nova_compute[186958]: 2025-11-29 07:11:50.386 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:51 np0005539505 nova_compute[186958]: 2025-11-29 07:11:51.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:52 np0005539505 podman[230714]: 2025-11-29 07:11:52.736645171 +0000 UTC m=+0.067568789 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 29 02:11:52 np0005539505 podman[230715]: 2025-11-29 07:11:52.753123779 +0000 UTC m=+0.082959266 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:11:54 np0005539505 nova_compute[186958]: 2025-11-29 07:11:54.429 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:55 np0005539505 nova_compute[186958]: 2025-11-29 07:11:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:55 np0005539505 nova_compute[186958]: 2025-11-29 07:11:55.388 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:11:55 np0005539505 podman[230758]: 2025-11-29 07:11:55.717667972 +0000 UTC m=+0.052483821 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:11:58 np0005539505 nova_compute[186958]: 2025-11-29 07:11:58.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:11:59 np0005539505 nova_compute[186958]: 2025-11-29 07:11:59.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:00 np0005539505 nova_compute[186958]: 2025-11-29 07:12:00.390 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:00 np0005539505 nova_compute[186958]: 2025-11-29 07:12:00.658 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:01 np0005539505 nova_compute[186958]: 2025-11-29 07:12:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:02 np0005539505 nova_compute[186958]: 2025-11-29 07:12:02.068 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:04 np0005539505 nova_compute[186958]: 2025-11-29 07:12:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:04 np0005539505 nova_compute[186958]: 2025-11-29 07:12:04.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:12:04 np0005539505 nova_compute[186958]: 2025-11-29 07:12:04.433 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:04 np0005539505 podman[230777]: 2025-11-29 07:12:04.720047658 +0000 UTC m=+0.050956387 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:12:04 np0005539505 podman[230778]: 2025-11-29 07:12:04.746482399 +0000 UTC m=+0.073393755 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 02:12:05 np0005539505 nova_compute[186958]: 2025-11-29 07:12:05.393 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:09 np0005539505 nova_compute[186958]: 2025-11-29 07:12:09.435 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:10 np0005539505 nova_compute[186958]: 2025-11-29 07:12:10.395 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:10 np0005539505 podman[230828]: 2025-11-29 07:12:10.720318344 +0000 UTC m=+0.053916291 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:12:10 np0005539505 podman[230829]: 2025-11-29 07:12:10.729233787 +0000 UTC m=+0.054576370 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:12:12 np0005539505 nova_compute[186958]: 2025-11-29 07:12:12.182 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:12:12 np0005539505 nova_compute[186958]: 2025-11-29 07:12:12.183 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:12 np0005539505 nova_compute[186958]: 2025-11-29 07:12:12.183 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:12:14 np0005539505 nova_compute[186958]: 2025-11-29 07:12:14.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:15 np0005539505 nova_compute[186958]: 2025-11-29 07:12:15.396 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:19 np0005539505 nova_compute[186958]: 2025-11-29 07:12:19.438 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:20 np0005539505 nova_compute[186958]: 2025-11-29 07:12:20.399 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:23 np0005539505 podman[230866]: 2025-11-29 07:12:23.716983794 +0000 UTC m=+0.047507560 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:12:23 np0005539505 podman[230865]: 2025-11-29 07:12:23.716973254 +0000 UTC m=+0.053343566 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6)
Nov 29 02:12:24 np0005539505 nova_compute[186958]: 2025-11-29 07:12:24.054 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.10 sec#033[00m
Nov 29 02:12:24 np0005539505 nova_compute[186958]: 2025-11-29 07:12:24.440 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:24.762 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:12:24 np0005539505 nova_compute[186958]: 2025-11-29 07:12:24.762 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:24.763 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:12:25 np0005539505 nova_compute[186958]: 2025-11-29 07:12:25.401 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:25.765 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:12:26 np0005539505 nova_compute[186958]: 2025-11-29 07:12:26.320 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:26 np0005539505 nova_compute[186958]: 2025-11-29 07:12:26.321 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:26 np0005539505 nova_compute[186958]: 2025-11-29 07:12:26.321 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:12:26 np0005539505 nova_compute[186958]: 2025-11-29 07:12:26.321 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:12:26 np0005539505 podman[230907]: 2025-11-29 07:12:26.736057096 +0000 UTC m=+0.071451739 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:12:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:26.951 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:26.951 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:12:26.951 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.127 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.128 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.128 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.128 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.129 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:12:27 np0005539505 nova_compute[186958]: 2025-11-29 07:12:27.129 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.238 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.238 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.238 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.239 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.368 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.369 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.22585678100586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.369 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.370 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.441 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.446 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.446 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.487 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.585 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.587 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:12:29 np0005539505 nova_compute[186958]: 2025-11-29 07:12:29.587 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:12:30 np0005539505 nova_compute[186958]: 2025-11-29 07:12:30.403 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:32 np0005539505 nova_compute[186958]: 2025-11-29 07:12:32.431 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:34 np0005539505 nova_compute[186958]: 2025-11-29 07:12:34.443 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:35 np0005539505 nova_compute[186958]: 2025-11-29 07:12:35.438 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:35 np0005539505 podman[230926]: 2025-11-29 07:12:35.704481738 +0000 UTC m=+0.041961201 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:12:35 np0005539505 podman[230927]: 2025-11-29 07:12:35.741820977 +0000 UTC m=+0.075621946 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:12:39 np0005539505 nova_compute[186958]: 2025-11-29 07:12:39.445 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:40 np0005539505 nova_compute[186958]: 2025-11-29 07:12:40.440 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:41 np0005539505 podman[230976]: 2025-11-29 07:12:41.735111005 +0000 UTC m=+0.054984882 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:12:41 np0005539505 podman[230975]: 2025-11-29 07:12:41.766469265 +0000 UTC m=+0.082901634 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 29 02:12:44 np0005539505 nova_compute[186958]: 2025-11-29 07:12:44.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:45 np0005539505 nova_compute[186958]: 2025-11-29 07:12:45.442 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.085 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.086 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.087 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:12:48.088 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:12:49 np0005539505 nova_compute[186958]: 2025-11-29 07:12:49.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:50 np0005539505 nova_compute[186958]: 2025-11-29 07:12:50.444 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539505 nova_compute[186958]: 2025-11-29 07:12:54.494 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:54 np0005539505 podman[231017]: 2025-11-29 07:12:54.727180497 +0000 UTC m=+0.051642947 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:12:54 np0005539505 podman[231016]: 2025-11-29 07:12:54.735974106 +0000 UTC m=+0.056402602 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:12:55 np0005539505 nova_compute[186958]: 2025-11-29 07:12:55.509 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:12:55 np0005539505 nova_compute[186958]: 2025-11-29 07:12:55.958 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:12:57 np0005539505 podman[231061]: 2025-11-29 07:12:57.708872166 +0000 UTC m=+0.045946955 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:12:59 np0005539505 nova_compute[186958]: 2025-11-29 07:12:59.496 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:00 np0005539505 nova_compute[186958]: 2025-11-29 07:13:00.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:00 np0005539505 nova_compute[186958]: 2025-11-29 07:13:00.511 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:01 np0005539505 nova_compute[186958]: 2025-11-29 07:13:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:04 np0005539505 nova_compute[186958]: 2025-11-29 07:13:04.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:05 np0005539505 nova_compute[186958]: 2025-11-29 07:13:05.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:06 np0005539505 nova_compute[186958]: 2025-11-29 07:13:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:06 np0005539505 podman[231080]: 2025-11-29 07:13:06.747993605 +0000 UTC m=+0.067706373 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:13:06 np0005539505 podman[231081]: 2025-11-29 07:13:06.769262668 +0000 UTC m=+0.094139573 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.790 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.791 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.791 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.791 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.962 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.963 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.22587585449219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.964 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:07 np0005539505 nova_compute[186958]: 2025-11-29 07:13:07.964 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:09 np0005539505 nova_compute[186958]: 2025-11-29 07:13:09.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:10 np0005539505 nova_compute[186958]: 2025-11-29 07:13:10.546 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:12 np0005539505 podman[231132]: 2025-11-29 07:13:12.72001867 +0000 UTC m=+0.053015835 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm)
Nov 29 02:13:12 np0005539505 podman[231131]: 2025-11-29 07:13:12.735921882 +0000 UTC m=+0.061658811 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:13:14 np0005539505 nova_compute[186958]: 2025-11-29 07:13:14.535 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:15 np0005539505 nova_compute[186958]: 2025-11-29 07:13:15.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:17 np0005539505 nova_compute[186958]: 2025-11-29 07:13:17.124 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:13:17 np0005539505 nova_compute[186958]: 2025-11-29 07:13:17.125 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:13:17 np0005539505 nova_compute[186958]: 2025-11-29 07:13:17.249 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:13:18 np0005539505 nova_compute[186958]: 2025-11-29 07:13:18.148 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:13:18 np0005539505 nova_compute[186958]: 2025-11-29 07:13:18.149 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:13:18 np0005539505 nova_compute[186958]: 2025-11-29 07:13:18.149 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 10.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:19 np0005539505 nova_compute[186958]: 2025-11-29 07:13:19.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:20 np0005539505 nova_compute[186958]: 2025-11-29 07:13:20.550 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.145 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.145 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.146 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.146 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.193 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.194 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.194 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.194 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:21 np0005539505 nova_compute[186958]: 2025-11-29 07:13:21.195 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:13:24 np0005539505 nova_compute[186958]: 2025-11-29 07:13:24.539 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:25 np0005539505 nova_compute[186958]: 2025-11-29 07:13:25.552 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:25.712 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:13:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:25.713 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:13:25 np0005539505 nova_compute[186958]: 2025-11-29 07:13:25.713 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:25 np0005539505 podman[231171]: 2025-11-29 07:13:25.730254965 +0000 UTC m=+0.060985022 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:13:25 np0005539505 podman[231170]: 2025-11-29 07:13:25.745303442 +0000 UTC m=+0.082277447 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc.)
Nov 29 02:13:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:26.952 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:26.953 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:26.953 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:28 np0005539505 podman[231215]: 2025-11-29 07:13:28.724166742 +0000 UTC m=+0.056538736 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:13:29 np0005539505 nova_compute[186958]: 2025-11-29 07:13:29.541 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:30 np0005539505 nova_compute[186958]: 2025-11-29 07:13:30.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:13:32.716 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:13:34 np0005539505 nova_compute[186958]: 2025-11-29 07:13:34.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:35 np0005539505 nova_compute[186958]: 2025-11-29 07:13:35.556 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:37 np0005539505 podman[231234]: 2025-11-29 07:13:37.708316821 +0000 UTC m=+0.045766910 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:13:37 np0005539505 podman[231235]: 2025-11-29 07:13:37.766091231 +0000 UTC m=+0.100028490 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 02:13:39 np0005539505 nova_compute[186958]: 2025-11-29 07:13:39.544 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:40 np0005539505 nova_compute[186958]: 2025-11-29 07:13:40.558 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:43 np0005539505 podman[231285]: 2025-11-29 07:13:43.719947831 +0000 UTC m=+0.048961121 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 29 02:13:43 np0005539505 podman[231284]: 2025-11-29 07:13:43.719968742 +0000 UTC m=+0.053116319 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:13:44 np0005539505 nova_compute[186958]: 2025-11-29 07:13:44.591 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:45 np0005539505 nova_compute[186958]: 2025-11-29 07:13:45.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:49 np0005539505 nova_compute[186958]: 2025-11-29 07:13:49.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:50 np0005539505 nova_compute[186958]: 2025-11-29 07:13:50.274 186962 DEBUG nova.compute.manager [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 29 02:13:50 np0005539505 nova_compute[186958]: 2025-11-29 07:13:50.574 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.072 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.073 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.128 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'pci_requests' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.147 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.147 186962 INFO nova.compute.claims [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.148 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'resources' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.175 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'numa_topology' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.197 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'pci_devices' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.251 186962 INFO nova.compute.resource_tracker [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating resource usage from migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.251 186962 DEBUG nova.compute.resource_tracker [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting to track incoming migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1 with flavor 1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.441 186962 DEBUG nova.compute.provider_tree [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.464 186962 DEBUG nova.scheduler.client.report [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.494 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:13:51 np0005539505 nova_compute[186958]: 2025-11-29 07:13:51.495 186962 INFO nova.compute.manager [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Migrating#033[00m
Nov 29 02:13:54 np0005539505 nova_compute[186958]: 2025-11-29 07:13:54.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:55 np0005539505 nova_compute[186958]: 2025-11-29 07:13:55.422 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:55 np0005539505 nova_compute[186958]: 2025-11-29 07:13:55.583 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:56 np0005539505 podman[231321]: 2025-11-29 07:13:56.718321319 +0000 UTC m=+0.044192075 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:13:56 np0005539505 podman[231320]: 2025-11-29 07:13:56.741011422 +0000 UTC m=+0.073485966 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:13:57 np0005539505 nova_compute[186958]: 2025-11-29 07:13:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:13:57 np0005539505 systemd-logind[794]: New session 57 of user nova.
Nov 29 02:13:57 np0005539505 systemd[1]: Created slice User Slice of UID 42436.
Nov 29 02:13:57 np0005539505 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 29 02:13:57 np0005539505 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 29 02:13:57 np0005539505 systemd[1]: Starting User Manager for UID 42436...
Nov 29 02:13:57 np0005539505 systemd[231365]: Queued start job for default target Main User Target.
Nov 29 02:13:57 np0005539505 systemd[231365]: Created slice User Application Slice.
Nov 29 02:13:57 np0005539505 systemd[231365]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:13:57 np0005539505 systemd[231365]: Started Daily Cleanup of User's Temporary Directories.
Nov 29 02:13:57 np0005539505 systemd[231365]: Reached target Paths.
Nov 29 02:13:57 np0005539505 systemd[231365]: Reached target Timers.
Nov 29 02:13:57 np0005539505 systemd[231365]: Starting D-Bus User Message Bus Socket...
Nov 29 02:13:57 np0005539505 systemd[231365]: Starting Create User's Volatile Files and Directories...
Nov 29 02:13:57 np0005539505 systemd[231365]: Finished Create User's Volatile Files and Directories.
Nov 29 02:13:57 np0005539505 systemd[231365]: Listening on D-Bus User Message Bus Socket.
Nov 29 02:13:57 np0005539505 systemd[231365]: Reached target Sockets.
Nov 29 02:13:57 np0005539505 systemd[231365]: Reached target Basic System.
Nov 29 02:13:57 np0005539505 systemd[231365]: Reached target Main User Target.
Nov 29 02:13:57 np0005539505 systemd[231365]: Startup finished in 115ms.
Nov 29 02:13:57 np0005539505 systemd[1]: Started User Manager for UID 42436.
Nov 29 02:13:57 np0005539505 systemd[1]: Started Session 57 of User nova.
Nov 29 02:13:57 np0005539505 systemd[1]: session-57.scope: Deactivated successfully.
Nov 29 02:13:57 np0005539505 systemd-logind[794]: Session 57 logged out. Waiting for processes to exit.
Nov 29 02:13:57 np0005539505 systemd-logind[794]: Removed session 57.
Nov 29 02:13:57 np0005539505 systemd-logind[794]: New session 59 of user nova.
Nov 29 02:13:57 np0005539505 systemd[1]: Started Session 59 of User nova.
Nov 29 02:13:57 np0005539505 systemd[1]: session-59.scope: Deactivated successfully.
Nov 29 02:13:57 np0005539505 systemd-logind[794]: Session 59 logged out. Waiting for processes to exit.
Nov 29 02:13:57 np0005539505 systemd-logind[794]: Removed session 59.
Nov 29 02:13:59 np0005539505 nova_compute[186958]: 2025-11-29 07:13:59.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:13:59 np0005539505 podman[231388]: 2025-11-29 07:13:59.715002655 +0000 UTC m=+0.048233301 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:14:00 np0005539505 nova_compute[186958]: 2025-11-29 07:14:00.586 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:01 np0005539505 systemd-logind[794]: New session 60 of user nova.
Nov 29 02:14:01 np0005539505 systemd[1]: Started Session 60 of User nova.
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.547 186962 DEBUG nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.547 186962 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.548 186962 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.548 186962 DEBUG oslo_concurrency.lockutils [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.548 186962 DEBUG nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:01 np0005539505 nova_compute[186958]: 2025-11-29 07:14:01.548 186962 WARNING nova.compute.manager [req-a661a79c-1f39-45d4-9dcd-9c77935069a5 req-3d5e9c7c-74d6-4651-b823-5da2b9d4b435 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:14:01 np0005539505 systemd[1]: session-60.scope: Deactivated successfully.
Nov 29 02:14:01 np0005539505 systemd-logind[794]: Session 60 logged out. Waiting for processes to exit.
Nov 29 02:14:01 np0005539505 systemd-logind[794]: Removed session 60.
Nov 29 02:14:01 np0005539505 systemd-logind[794]: New session 61 of user nova.
Nov 29 02:14:01 np0005539505 systemd[1]: Started Session 61 of User nova.
Nov 29 02:14:01 np0005539505 systemd[1]: session-61.scope: Deactivated successfully.
Nov 29 02:14:01 np0005539505 systemd-logind[794]: Session 61 logged out. Waiting for processes to exit.
Nov 29 02:14:01 np0005539505 systemd-logind[794]: Removed session 61.
Nov 29 02:14:02 np0005539505 systemd-logind[794]: New session 62 of user nova.
Nov 29 02:14:02 np0005539505 systemd[1]: Started Session 62 of User nova.
Nov 29 02:14:02 np0005539505 systemd[1]: session-62.scope: Deactivated successfully.
Nov 29 02:14:02 np0005539505 systemd-logind[794]: Session 62 logged out. Waiting for processes to exit.
Nov 29 02:14:02 np0005539505 systemd-logind[794]: Removed session 62.
Nov 29 02:14:02 np0005539505 nova_compute[186958]: 2025-11-29 07:14:02.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:03 np0005539505 nova_compute[186958]: 2025-11-29 07:14:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.246 186962 DEBUG nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.246 186962 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.246 186962 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.246 186962 DEBUG oslo_concurrency.lockutils [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.247 186962 DEBUG nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.247 186962 WARNING nova.compute.manager [req-29ee208c-9b8b-4d48-a2e6-303f722040b0 req-cf457de7-5d8d-4eb0-aec9-66caa561e2c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.267 186962 INFO nova.network.neutron [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating port 07a930ef-a036-4ddf-aa57-c5d56f77847c with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:14:04 np0005539505 nova_compute[186958]: 2025-11-29 07:14:04.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:05.472 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:05.472 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:14:05 np0005539505 nova_compute[186958]: 2025-11-29 07:14:05.473 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:05 np0005539505 nova_compute[186958]: 2025-11-29 07:14:05.588 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.646 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.647 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.648 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.648 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.820 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.820 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.821 186962 DEBUG nova.network.neutron [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.850 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.851 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5710MB free_disk=73.197265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.851 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:06 np0005539505 nova_compute[186958]: 2025-11-29 07:14:06.851 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.167 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Applying migration context for instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 as it has an incoming, in-progress migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.168 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating resource usage from migration 7d62bbd7-9748-439d-af70-ef4b5d8cefb1#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.220 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 5818027f-a5b1-465a-a6e2-f0c8f0de8154 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.220 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.221 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.241 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.263 186962 DEBUG nova.compute.manager [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.264 186962 DEBUG nova.compute.manager [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.264 186962 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.284 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.284 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.333 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.358 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.441 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.462 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
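The "inventory has not changed" comparison above turns on Placement's effective-capacity formula, capacity = (total - reserved) * allocation_ratio. A minimal sketch (plain Python, not Nova's actual code; the helper name is ours) applied to the inventory values logged in that entry:

```python
# Inventory values copied from the report-client log entry above.
# Only the fields the capacity formula needs are kept.
inventory = {
    'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
}

def effective_capacity(inv):
    """Schedulable amount per resource class: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

print(effective_capacity(inventory))
```

So this provider advertises 32 schedulable VCPUs, 7167 MB of RAM, and roughly 70 GB of disk despite the 8/7679/79 physical totals.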
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.513 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:14:07 np0005539505 nova_compute[186958]: 2025-11-29 07:14:07.513 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:08 np0005539505 podman[231422]: 2025-11-29 07:14:08.719170071 +0000 UTC m=+0.048103606 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:14:08 np0005539505 podman[231423]: 2025-11-29 07:14:08.787444979 +0000 UTC m=+0.115112778 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:14:09 np0005539505 nova_compute[186958]: 2025-11-29 07:14:09.694 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:09 np0005539505 nova_compute[186958]: 2025-11-29 07:14:09.754 186962 DEBUG nova.network.neutron [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
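The `network_info` payload in the cache update above nests addresses as VIF → network → subnets → ips → floating_ips. A small sketch for pulling the fixed/floating pairs out of that structure (the JSON below is the logged entry trimmed to the fields this reads):

```python
import json

# Structure copied from the instance_info_cache entry above, trimmed to the
# fields this sketch walks.
network_info = json.loads('''[{
  "id": "07a930ef-a036-4ddf-aa57-c5d56f77847c",
  "network": {"subnets": [{"ips": [{
      "address": "10.100.0.13", "type": "fixed",
      "floating_ips": [{"address": "192.168.122.197", "type": "floating"}]
  }]}]}
}]''')

def addresses(nw_info):
    """Flatten (fixed_ip, [floating_ips]) pairs out of a network_info list."""
    out = []
    for vif in nw_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                out.append((ip["address"],
                            [f["address"] for f in ip.get("floating_ips", [])]))
    return out

print(addresses(network_info))  # -> [('10.100.0.13', ['192.168.122.197'])]
```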
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.190 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.194 186962 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.194 186962 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.501 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.504 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.505 186962 INFO nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Creating image(s)#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.506 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.590 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.661 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.757 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.758 186962 DEBUG nova.virt.disk.api [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Checking if we can resize image /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.759 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.811 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
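Both `qemu-img info ... --output=json` probes above feed the resize check that follows ("Cannot resize image ... to a smaller size"). A sketch of that check against qemu-img's JSON — the payload below is illustrative (the key names `virtual-size`/`format`/`actual-size` are qemu-img's real output fields, the values are made up), and the helper is a simplified model of Nova's `can_resize_image`, not its code:

```python
import json

# Illustrative qemu-img info --output=json payload.
sample = '''{
  "virtual-size": 1073741824,
  "format": "qcow2",
  "actual-size": 21430272
}'''

info = json.loads(sample)

def can_resize(current_virtual_size, requested_size):
    """Simplified model of the check logged above: a disk image may grow,
    but shrinking below its current virtual size is refused."""
    return requested_size >= current_virtual_size

# The prlimit wrapper in the log (--as=1073741824 --cpu=30) caps qemu-img's
# address space and CPU time so a malformed image can't wedge the compute host.
print(can_resize(info["virtual-size"], 1073741824))
```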
Nov 29 02:14:10 np0005539505 nova_compute[186958]: 2025-11-29 07:14:10.812 186962 DEBUG nova.virt.disk.api [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Cannot resize image /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.316 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.316 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Ensure instance console log exists: /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.317 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.317 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.318 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.320 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Start _get_guest_xml network_info=[{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.325 186962 WARNING nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.331 186962 DEBUG nova.virt.libvirt.host [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.332 186962 DEBUG nova.virt.libvirt.host [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.334 186962 DEBUG nova.virt.libvirt.host [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.335 186962 DEBUG nova.virt.libvirt.host [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.337 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.337 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.338 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.338 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.339 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.339 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.339 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.340 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.340 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.341 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.341 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.342 186962 DEBUG nova.virt.hardware [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
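The topology search logged above (one candidate for 1 vCPU under the 65536/65536/65536 limits) amounts to enumerating factorizations of the vCPU count into sockets × cores × threads. A simplified sketch — not Nova's `_get_possible_cpu_topologies`, which also applies preference-based ordering:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product equals vcpus,
    within the per-dimension limits."""
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))  # -> [(1, 1, 1)]
```

For this 1-vCPU m1.nano guest there is exactly one factorization, which is why the log reports "Got 1 possible topologies" and picks 1 socket, 1 core, 1 thread.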
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.342 186962 DEBUG nova.objects.instance [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.696 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.751 186962 DEBUG oslo_concurrency.processutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.753 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.753 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.756 186962 DEBUG oslo_concurrency.lockutils [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.758 186962 DEBUG nova.virt.libvirt.vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:13:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:02Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.760 186962 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.763 186962 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.767 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <uuid>5818027f-a5b1-465a-a6e2-f0c8f0de8154</uuid>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <name>instance-0000005f</name>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-409239588</nova:name>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:14:11</nova:creationTime>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        <nova:port uuid="07a930ef-a036-4ddf-aa57-c5d56f77847c">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="serial">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="uuid">5818027f-a5b1-465a-a6e2-f0c8f0de8154</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/disk.config"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:c4:6c:2d"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <target dev="tap07a930ef-a0"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/console.log" append="off"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:14:11 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:14:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:14:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:14:11 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.769 186962 DEBUG nova.virt.libvirt.vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:13:01Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:02Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.769 186962 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1883832764", "vif_mac": "fa:16:3e:c4:6c:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.770 186962 DEBUG nova.network.os_vif_util [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.770 186962 DEBUG os_vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.771 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.771 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.772 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.779 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.779 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07a930ef-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.779 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap07a930ef-a0, col_values=(('external_ids', {'iface-id': '07a930ef-a036-4ddf-aa57-c5d56f77847c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:6c:2d', 'vm-uuid': '5818027f-a5b1-465a-a6e2-f0c8f0de8154'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.782 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.785 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.7853] manager: (tap07a930ef-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.791 186962 INFO os_vif [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.891 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.891 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.891 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No VIF found with MAC fa:16:3e:c4:6c:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.892 186962 INFO nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Using config drive#033[00m
Nov 29 02:14:11 np0005539505 kernel: tap07a930ef-a0: entered promiscuous mode
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.9489] manager: (tap07a930ef-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.949 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:11Z|00383|binding|INFO|Claiming lport 07a930ef-a036-4ddf-aa57-c5d56f77847c for this chassis.
Nov 29 02:14:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:11Z|00384|binding|INFO|07a930ef-a036-4ddf-aa57-c5d56f77847c: Claiming fa:16:3e:c4:6c:2d 10.100.0.13
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.953 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.957 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 nova_compute[186958]: 2025-11-29 07:14:11.976 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.9766] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.9772] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Nov 29 02:14:11 np0005539505 systemd-udevd[231492]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:11.985 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.197'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:11.986 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b bound to our chassis#033[00m
Nov 29 02:14:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:11.988 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c188a1f4-7511-4259-992e-c9127e6a414b#033[00m
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.9930] device (tap07a930ef-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:14:11 np0005539505 NetworkManager[55134]: <info>  [1764400451.9940] device (tap07a930ef-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:14:12 np0005539505 systemd-machined[153285]: New machine qemu-49-instance-0000005f.
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.001 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b78f54b8-5cc3-4ad1-bcf1-43ecb28f6eb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.002 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc188a1f4-71 in ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.004 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc188a1f4-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.005 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3650473c-205d-40f3-8973-05a18a9107c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.006 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[251508d0-2b5b-4b16-be10-84f3d065f39f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.019 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e4356a81-0644-476d-86a2-09e59c7836ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 systemd[1]: Started Virtual Machine qemu-49-instance-0000005f.
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.051 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[24e31285-c6bc-4324-bf21-db90bf0acd1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.084 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[59237863-2232-4b99-b1ea-abdec02a073b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 NetworkManager[55134]: <info>  [1764400452.1031] manager: (tapc188a1f4-70): new Veth device (/org/freedesktop/NetworkManager/Devices/199)
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.102 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ea614d-3668-4c19-a55a-bdcc78f1ca2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 systemd-udevd[231496]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.129 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5a6dcfc0-e393-4a54-a772-46269f6b525a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.132 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4ffc1395-ed0e-4a81-9a7b-8c8b072e5e2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 NetworkManager[55134]: <info>  [1764400452.1546] device (tapc188a1f4-70): carrier: link connected
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.161 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[26af92ac-7c19-4148-bac1-198c3c567f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.183 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.190 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[08633e9b-89a9-4cac-b0b8-b5a458dfa3c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589973, 'reachable_time': 20609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231526, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.204 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec49678c-5ca0-4f34-a139-94f1ccae5a8d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:a87'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589973, 'tstamp': 589973}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231527, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.214 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.222 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9f5a7a-0905-4d7d-95e0-78cdd0bf04a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc188a1f4-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:0a:87'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589973, 'reachable_time': 20609, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231530, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:12Z|00385|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c ovn-installed in OVS
Nov 29 02:14:12 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:12Z|00386|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c up in Southbound
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.224 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.255 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b7327a7e-c1eb-45b9-b7c8-7123ef1d27e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.314 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400452.3135004, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.314 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.315 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e396c16-34d7-4a2b-9d4d-16279cb7473e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.316 186962 DEBUG nova.compute.manager [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.317 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.318 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.318 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc188a1f4-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.319 186962 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance running successfully.#033[00m
Nov 29 02:14:12 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.321 186962 DEBUG nova.virt.libvirt.guest [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.321 186962 DEBUG nova.virt.libvirt.driver [None req-d941f0b2-a0b6-4785-b6f3-c8610ecf38d8 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.326 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 kernel: tapc188a1f4-70: entered promiscuous mode
Nov 29 02:14:12 np0005539505 NetworkManager[55134]: <info>  [1764400452.3271] manager: (tapc188a1f4-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/200)
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.328 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.330 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc188a1f4-70, col_values=(('external_ids', {'iface-id': 'a383047a-7ad7-4f43-a653-f18a79d8acb1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:12Z|00387|binding|INFO|Releasing lport a383047a-7ad7-4f43-a653-f18a79d8acb1 from this chassis (sb_readonly=0)
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.332 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.334 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.335 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f59e9576-7752-4224-8e2e-3beeb899c6cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.336 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/c188a1f4-7511-4259-992e-c9127e6a414b.pid.haproxy
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID c188a1f4-7511-4259-992e-c9127e6a414b
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:14:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:12.336 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'env', 'PROCESS_TAG=haproxy-c188a1f4-7511-4259-992e-c9127e6a414b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c188a1f4-7511-4259-992e-c9127e6a414b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.354 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.358 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:12 np0005539505 systemd[1]: Stopping User Manager for UID 42436...
Nov 29 02:14:12 np0005539505 systemd[231365]: Activating special unit Exit the Session...
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped target Main User Target.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped target Basic System.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped target Paths.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped target Sockets.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped target Timers.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 29 02:14:12 np0005539505 systemd[231365]: Closed D-Bus User Message Bus Socket.
Nov 29 02:14:12 np0005539505 systemd[231365]: Stopped Create User's Volatile Files and Directories.
Nov 29 02:14:12 np0005539505 systemd[231365]: Removed slice User Application Slice.
Nov 29 02:14:12 np0005539505 systemd[231365]: Reached target Shutdown.
Nov 29 02:14:12 np0005539505 systemd[231365]: Finished Exit the Session.
Nov 29 02:14:12 np0005539505 systemd[231365]: Reached target Exit the Session.
Nov 29 02:14:12 np0005539505 systemd[1]: user@42436.service: Deactivated successfully.
Nov 29 02:14:12 np0005539505 systemd[1]: Stopped User Manager for UID 42436.
Nov 29 02:14:12 np0005539505 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.412 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.414 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400452.3146925, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.414 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Started (Lifecycle Event)#033[00m
Nov 29 02:14:12 np0005539505 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 29 02:14:12 np0005539505 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 29 02:14:12 np0005539505 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 29 02:14:12 np0005539505 systemd[1]: Removed slice User Slice of UID 42436.
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.477 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.482 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.508 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.509 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.509 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:14:12 np0005539505 nova_compute[186958]: 2025-11-29 07:14:12.509 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:14:12 np0005539505 podman[231568]: 2025-11-29 07:14:12.753699307 +0000 UTC m=+0.057711959 container create b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:14:12 np0005539505 systemd[1]: Started libpod-conmon-b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a.scope.
Nov 29 02:14:12 np0005539505 podman[231568]: 2025-11-29 07:14:12.722481991 +0000 UTC m=+0.026494673 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:14:12 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:14:12 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/844fe09f8f9012e0829cf2ddd7c7b50d7a312b118afaff121911eb430850e7a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:14:12 np0005539505 podman[231568]: 2025-11-29 07:14:12.839691838 +0000 UTC m=+0.143704510 container init b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:14:12 np0005539505 podman[231568]: 2025-11-29 07:14:12.846943434 +0000 UTC m=+0.150956086 container start b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:14:12 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [NOTICE]   (231587) : New worker (231590) forked
Nov 29 02:14:12 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [NOTICE]   (231587) : Loading success.
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.293 186962 DEBUG nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.295 186962 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.295 186962 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.296 186962 DEBUG oslo_concurrency.lockutils [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.296 186962 DEBUG nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.296 186962 WARNING nova.compute.manager [req-94671927-9baa-4a5b-91ec-9f6e0c45ab8f req-c1c3198d-8b68-4b7c-84a1-296355070391 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:14:13 np0005539505 nova_compute[186958]: 2025-11-29 07:14:13.347 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.733 186962 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.734 186962 DEBUG nova.network.neutron [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.735 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:14 np0005539505 podman[231599]: 2025-11-29 07:14:14.758023202 +0000 UTC m=+0.088369399 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 02:14:14 np0005539505 podman[231600]: 2025-11-29 07:14:14.786024807 +0000 UTC m=+0.111920038 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.license=GPLv2)
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.986 186962 DEBUG oslo_concurrency.lockutils [req-2f3bec87-1ed9-48b3-955e-d024debf5a5a req-eabea806-0fe2-4908-8a0a-c80526898ee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.987 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.987 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:14:14 np0005539505 nova_compute[186958]: 2025-11-29 07:14:14.990 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:15.475 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.622 186962 DEBUG nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.623 186962 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.623 186962 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.624 186962 DEBUG oslo_concurrency.lockutils [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.624 186962 DEBUG nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.624 186962 WARNING nova.compute.manager [req-6dc1b89d-6980-433e-bbe2-b776008446e0 req-1cd31f17-90ea-422f-ac9b-66ea17f68d33 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.854 186962 DEBUG nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Nov 29 02:14:15 np0005539505 nova_compute[186958]: 2025-11-29 07:14:15.856 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:16 np0005539505 nova_compute[186958]: 2025-11-29 07:14:16.782 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:19 np0005539505 nova_compute[186958]: 2025-11-29 07:14:19.736 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:20 np0005539505 nova_compute[186958]: 2025-11-29 07:14:20.917 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.164 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.165 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.165 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.166 186962 DEBUG nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.168 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.169 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.169 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.170 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:14:21 np0005539505 nova_compute[186958]: 2025-11-29 07:14:21.784 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:23 np0005539505 nova_compute[186958]: 2025-11-29 07:14:23.532 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:23 np0005539505 nova_compute[186958]: 2025-11-29 07:14:23.735 186962 DEBUG nova.network.neutron [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.021 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.118 186962 DEBUG nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Creating tmpfile /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154/tmp88opuhzf to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618#033[00m
Nov 29 02:14:24 np0005539505 kernel: tap07a930ef-a0 (unregistering): left promiscuous mode
Nov 29 02:14:24 np0005539505 NetworkManager[55134]: <info>  [1764400464.3188] device (tap07a930ef-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:14:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:24Z|00388|binding|INFO|Releasing lport 07a930ef-a036-4ddf-aa57-c5d56f77847c from this chassis (sb_readonly=0)
Nov 29 02:14:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:24Z|00389|binding|INFO|Setting lport 07a930ef-a036-4ddf-aa57-c5d56f77847c down in Southbound
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.327 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:24Z|00390|binding|INFO|Removing iface tap07a930ef-a0 ovn-installed in OVS
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.344 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:6c:2d 10.100.0.13'], port_security=['fa:16:3e:c4:6c:2d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '5818027f-a5b1-465a-a6e2-f0c8f0de8154', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c188a1f4-7511-4259-992e-c9127e6a414b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ede51bf8-0086-4a77-b4a9-badf8936b8c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.197', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aab533cd-f26a-47b5-9334-c93bf39572b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=07a930ef-a036-4ddf-aa57-c5d56f77847c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.345 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.346 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 07a930ef-a036-4ddf-aa57-c5d56f77847c in datapath c188a1f4-7511-4259-992e-c9127e6a414b unbound from our chassis#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.348 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c188a1f4-7511-4259-992e-c9127e6a414b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.350 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[71b90e26-f845-41d9-aa7c-f11d6be87f62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.351 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b namespace which is not needed anymore#033[00m
Nov 29 02:14:24 np0005539505 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005f.scope: Deactivated successfully.
Nov 29 02:14:24 np0005539505 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000005f.scope: Consumed 11.744s CPU time.
Nov 29 02:14:24 np0005539505 systemd-machined[153285]: Machine qemu-49-instance-0000005f terminated.
Nov 29 02:14:24 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [NOTICE]   (231587) : haproxy version is 2.8.14-c23fe91
Nov 29 02:14:24 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [NOTICE]   (231587) : path to executable is /usr/sbin/haproxy
Nov 29 02:14:24 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [WARNING]  (231587) : Exiting Master process...
Nov 29 02:14:24 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [ALERT]    (231587) : Current worker (231590) exited with code 143 (Terminated)
Nov 29 02:14:24 np0005539505 neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b[231583]: [WARNING]  (231587) : All workers exited. Exiting... (0)
Nov 29 02:14:24 np0005539505 systemd[1]: libpod-b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a.scope: Deactivated successfully.
Nov 29 02:14:24 np0005539505 podman[231673]: 2025-11-29 07:14:24.502497965 +0000 UTC m=+0.063770431 container died b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:14:24 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a-userdata-shm.mount: Deactivated successfully.
Nov 29 02:14:24 np0005539505 systemd[1]: var-lib-containers-storage-overlay-844fe09f8f9012e0829cf2ddd7c7b50d7a312b118afaff121911eb430850e7a3-merged.mount: Deactivated successfully.
Nov 29 02:14:24 np0005539505 podman[231673]: 2025-11-29 07:14:24.549263682 +0000 UTC m=+0.110536148 container cleanup b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:14:24 np0005539505 systemd[1]: libpod-conmon-b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a.scope: Deactivated successfully.
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.557 186962 INFO nova.virt.libvirt.driver [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Instance destroyed successfully.#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.557 186962 DEBUG nova.objects.instance [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.570 186962 DEBUG nova.virt.libvirt.vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:11:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-409239588',display_name='tempest-TestNetworkAdvancedServerOps-server-409239588',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-409239588',id=95,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGXK6HijxfcV9+fEMlQT2yR3VoX9Sz77Qk57Xkpwoye1FFlDLU8fY8cJvr+Q2fRauh1dlNIWCagiMxv7znT2NcZAvXyo+qqZudIr0NVBck3Lt9NyetTtYoJBqcrR4BWObg==',key_name='tempest-TestNetworkAdvancedServerOps-1900401721',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:12Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-6l59ck53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:12Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=5818027f-a5b1-465a-a6e2-f0c8f0de8154,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.570 186962 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.571 186962 DEBUG nova.network.os_vif_util [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.571 186962 DEBUG os_vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.574 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.575 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07a930ef-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 podman[231720]: 2025-11-29 07:14:24.615347448 +0000 UTC m=+0.044954657 container remove b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.621 186962 INFO os_vif [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c4:6c:2d,bridge_name='br-int',has_traffic_filtering=True,id=07a930ef-a036-4ddf-aa57-c5d56f77847c,network=Network(c188a1f4-7511-4259-992e-c9127e6a414b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap07a930ef-a0')#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.622 186962 INFO nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Deleting instance files /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_del#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.621 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b420e2a-0e5c-42e3-ad54-ac04808484f6]: (4, ('Sat Nov 29 07:14:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a)\nb8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a\nSat Nov 29 07:14:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b (b8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a)\nb8f0466c1deb3ebb4f85b26881a15479b4b9afc8b5b9e72de3598b642131f53a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.623 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8c0345-2964-42b5-8583-77e5cf971f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.624 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc188a1f4-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:24 np0005539505 kernel: tapc188a1f4-70: left promiscuous mode
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.628 186962 INFO nova.virt.libvirt.driver [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Deletion of /var/lib/nova/instances/5818027f-a5b1-465a-a6e2-f0c8f0de8154_del complete#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.642 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b172f95-a9f2-4f60-94ea-68c07134ccb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.661 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7a096d-6d37-4302-8b78-2f1fe8db5f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.662 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6d24db3f-9008-40e6-8ada-ccb20c83ca0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.677 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6380e119-0d11-44d4-bbcb-cd2bf5a0f898]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589966, 'reachable_time': 34034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231735, 'error': None, 'target': 'ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.681 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c188a1f4-7511-4259-992e-c9127e6a414b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:14:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:24.681 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[1e257913-f5e2-4cc0-8792-1ee8c4f6f5ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:24 np0005539505 systemd[1]: run-netns-ovnmeta\x2dc188a1f4\x2d7511\x2d4259\x2d992e\x2dc9127e6a414b.mount: Deactivated successfully.
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.738 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.839 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:24 np0005539505 nova_compute[186958]: 2025-11-29 07:14:24.840 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.193 186962 DEBUG nova.objects.instance [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 5818027f-a5b1-465a-a6e2-f0c8f0de8154 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.323 186962 DEBUG nova.compute.provider_tree [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.380 186962 DEBUG nova.scheduler.client.report [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.781 186962 DEBUG oslo_concurrency.lockutils [None req-47679014-0acb-4d7d-815f-28358b61cf77 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.890 186962 DEBUG nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.890 186962 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.891 186962 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.891 186962 DEBUG oslo_concurrency.lockutils [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.891 186962 DEBUG nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:25 np0005539505 nova_compute[186958]: 2025-11-29 07:14:25.892 186962 WARNING nova.compute.manager [req-f2ba9600-0a1e-458e-be66-3edddce2659f req-151710e9-4277-4e4a-8156-ccb3ce4fb1aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-unplugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:26.953 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:26.953 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:26.954 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:27 np0005539505 podman[231737]: 2025-11-29 07:14:27.718827895 +0000 UTC m=+0.049177447 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:14:27 np0005539505 podman[231736]: 2025-11-29 07:14:27.724520357 +0000 UTC m=+0.057360260 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7)
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.541 186962 DEBUG nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.542 186962 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.542 186962 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.542 186962 DEBUG oslo_concurrency.lockutils [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.542 186962 DEBUG nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:28 np0005539505 nova_compute[186958]: 2025-11-29 07:14:28.542 186962 WARNING nova.compute.manager [req-fc9260e7-b283-4485-85f3-83237ec819e7 req-75d338a6-d665-45da-a19b-dd9ebc0e7606 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.412 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.412 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.440 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.605 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.606 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.621 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.622 186962 INFO nova.compute.claims [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.739 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.788 186962 DEBUG nova.compute.manager [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.789 186962 DEBUG nova.compute.manager [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing instance network info cache due to event network-changed-07a930ef-a036-4ddf-aa57-c5d56f77847c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.789 186962 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.790 186962 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:29 np0005539505 nova_compute[186958]: 2025-11-29 07:14:29.790 186962 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Refreshing network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.010 186962 DEBUG nova.compute.provider_tree [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.037 186962 DEBUG nova.scheduler.client.report [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.110 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.110 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.242 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.242 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.268 186962 INFO nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.294 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.499 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.500 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.501 186962 INFO nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Creating image(s)#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.501 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.502 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.502 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.515 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.571 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.572 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.573 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.589 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.644 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.645 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.679 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.680 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.680 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.737 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.739 186962 DEBUG nova.virt.disk.api [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Checking if we can resize image /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.739 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:30 np0005539505 podman[231785]: 2025-11-29 07:14:30.740001126 +0000 UTC m=+0.074813305 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.794 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.794 186962 DEBUG nova.virt.disk.api [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Cannot resize image /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.795 186962 DEBUG nova.objects.instance [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lazy-loading 'migration_context' on Instance uuid 61add905-a2e1-4724-a51d-25f62f9bfc45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.844 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.844 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Ensure instance console log exists: /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.845 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.845 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:30 np0005539505 nova_compute[186958]: 2025-11-29 07:14:30.845 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:31 np0005539505 nova_compute[186958]: 2025-11-29 07:14:31.126 186962 DEBUG nova.policy [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:14:34 np0005539505 nova_compute[186958]: 2025-11-29 07:14:34.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:34 np0005539505 nova_compute[186958]: 2025-11-29 07:14:34.741 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:35 np0005539505 nova_compute[186958]: 2025-11-29 07:14:35.035 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Successfully created port: 713aec40-e89c-478d-ba3a-018459dfab17 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:14:35 np0005539505 nova_compute[186958]: 2025-11-29 07:14:35.131 186962 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updated VIF entry in instance network info cache for port 07a930ef-a036-4ddf-aa57-c5d56f77847c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:35 np0005539505 nova_compute[186958]: 2025-11-29 07:14:35.132 186962 DEBUG nova.network.neutron [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Updating instance_info_cache with network_info: [{"id": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "address": "fa:16:3e:c4:6c:2d", "network": {"id": "c188a1f4-7511-4259-992e-c9127e6a414b", "bridge": "br-int", "label": "tempest-network-smoke--1883832764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap07a930ef-a0", "ovs_interfaceid": "07a930ef-a036-4ddf-aa57-c5d56f77847c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:35 np0005539505 nova_compute[186958]: 2025-11-29 07:14:35.158 186962 DEBUG oslo_concurrency.lockutils [req-ecb22528-8996-4b50-be5b-38f91ae80a1c req-613fbd94-66cb-4e56-894e-71fb6c1c7e24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5818027f-a5b1-465a-a6e2-f0c8f0de8154" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.647 186962 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.648 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.648 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.648 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.648 186962 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.649 186962 WARNING nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.649 186962 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.649 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.649 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.649 186962 DEBUG oslo_concurrency.lockutils [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5818027f-a5b1-465a-a6e2-f0c8f0de8154-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.650 186962 DEBUG nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] No waiting events found dispatching network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.650 186962 WARNING nova.compute.manager [req-d160997e-0175-4adb-98c5-e266b3cdd3c4 req-07561fad-107a-4b8d-a3bc-bde5babfe8fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Received unexpected event network-vif-plugged-07a930ef-a036-4ddf-aa57-c5d56f77847c for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.805 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Successfully updated port: 713aec40-e89c-478d-ba3a-018459dfab17 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.838 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.839 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquired lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:37 np0005539505 nova_compute[186958]: 2025-11-29 07:14:37.839 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:14:38 np0005539505 nova_compute[186958]: 2025-11-29 07:14:38.000 186962 DEBUG nova.compute.manager [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-changed-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:38 np0005539505 nova_compute[186958]: 2025-11-29 07:14:38.003 186962 DEBUG nova.compute.manager [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Refreshing instance network info cache due to event network-changed-713aec40-e89c-478d-ba3a-018459dfab17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:14:38 np0005539505 nova_compute[186958]: 2025-11-29 07:14:38.003 186962 DEBUG oslo_concurrency.lockutils [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:14:38 np0005539505 nova_compute[186958]: 2025-11-29 07:14:38.141 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:14:39 np0005539505 nova_compute[186958]: 2025-11-29 07:14:39.555 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400464.5545206, 5818027f-a5b1-465a-a6e2-f0c8f0de8154 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:39 np0005539505 nova_compute[186958]: 2025-11-29 07:14:39.556 186962 INFO nova.compute.manager [-] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:14:39 np0005539505 nova_compute[186958]: 2025-11-29 07:14:39.581 186962 DEBUG nova.compute.manager [None req-2df5f021-df18-4d8d-8215-b0a2b96d86fd - - - - - -] [instance: 5818027f-a5b1-465a-a6e2-f0c8f0de8154] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:39 np0005539505 nova_compute[186958]: 2025-11-29 07:14:39.620 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:39 np0005539505 podman[231811]: 2025-11-29 07:14:39.714571263 +0000 UTC m=+0.050030260 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:14:39 np0005539505 nova_compute[186958]: 2025-11-29 07:14:39.743 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:39 np0005539505 podman[231812]: 2025-11-29 07:14:39.753175618 +0000 UTC m=+0.084860948 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.204 186962 DEBUG nova.network.neutron [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Updating instance_info_cache with network_info: [{"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.223 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Releasing lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.223 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Instance network_info: |[{"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.223 186962 DEBUG oslo_concurrency.lockutils [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.224 186962 DEBUG nova.network.neutron [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Refreshing network info cache for port 713aec40-e89c-478d-ba3a-018459dfab17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.226 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Start _get_guest_xml network_info=[{"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.230 186962 WARNING nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.235 186962 DEBUG nova.virt.libvirt.host [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.236 186962 DEBUG nova.virt.libvirt.host [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.249 186962 DEBUG nova.virt.libvirt.host [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.250 186962 DEBUG nova.virt.libvirt.host [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.251 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.251 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.252 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.252 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.252 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.253 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.253 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.253 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.253 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.253 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.254 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.254 186962 DEBUG nova.virt.hardware [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.257 186962 DEBUG nova.virt.libvirt.vif [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-994196049',display_name='tempest-ServerMetadataNegativeTestJSON-server-994196049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-994196049',id=98,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f74fdd4985ee459e9ad295d5f888bd61',ramdisk_id='',reservation_id='r-18550hz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-374554764',owner_user_n
ame='tempest-ServerMetadataNegativeTestJSON-374554764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:30Z,user_data=None,user_id='970e2faf924c47f4baca59706f567228',uuid=61add905-a2e1-4724-a51d-25f62f9bfc45,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.257 186962 DEBUG nova.network.os_vif_util [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converting VIF {"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.258 186962 DEBUG nova.network.os_vif_util [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.259 186962 DEBUG nova.objects.instance [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61add905-a2e1-4724-a51d-25f62f9bfc45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.275 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <uuid>61add905-a2e1-4724-a51d-25f62f9bfc45</uuid>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <name>instance-00000062</name>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-994196049</nova:name>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:14:40</nova:creationTime>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:user uuid="970e2faf924c47f4baca59706f567228">tempest-ServerMetadataNegativeTestJSON-374554764-project-member</nova:user>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:project uuid="f74fdd4985ee459e9ad295d5f888bd61">tempest-ServerMetadataNegativeTestJSON-374554764</nova:project>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        <nova:port uuid="713aec40-e89c-478d-ba3a-018459dfab17">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="serial">61add905-a2e1-4724-a51d-25f62f9bfc45</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="uuid">61add905-a2e1-4724-a51d-25f62f9bfc45</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.config"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:d0:72:87"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <target dev="tap713aec40-e8"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/console.log" append="off"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:14:40 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:14:40 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:14:40 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:14:40 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.277 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Preparing to wait for external event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.277 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.277 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.277 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.278 186962 DEBUG nova.virt.libvirt.vif [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-994196049',display_name='tempest-ServerMetadataNegativeTestJSON-server-994196049',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-994196049',id=98,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f74fdd4985ee459e9ad295d5f888bd61',ramdisk_id='',reservation_id='r-18550hz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-374554764',ow
ner_user_name='tempest-ServerMetadataNegativeTestJSON-374554764-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:14:30Z,user_data=None,user_id='970e2faf924c47f4baca59706f567228',uuid=61add905-a2e1-4724-a51d-25f62f9bfc45,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.278 186962 DEBUG nova.network.os_vif_util [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converting VIF {"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.279 186962 DEBUG nova.network.os_vif_util [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.279 186962 DEBUG os_vif [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.280 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.280 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.285 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.285 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap713aec40-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.286 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap713aec40-e8, col_values=(('external_ids', {'iface-id': '713aec40-e89c-478d-ba3a-018459dfab17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:72:87', 'vm-uuid': '61add905-a2e1-4724-a51d-25f62f9bfc45'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.287 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:40 np0005539505 NetworkManager[55134]: <info>  [1764400480.2891] manager: (tap713aec40-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.290 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.293 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.294 186962 INFO os_vif [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8')#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.364 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.365 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.365 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] No VIF found with MAC fa:16:3e:d0:72:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.366 186962 INFO nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Using config drive#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.991 186962 INFO nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Creating config drive at /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.config#033[00m
Nov 29 02:14:40 np0005539505 nova_compute[186958]: 2025-11-29 07:14:40.997 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy3twfaka execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.124 186962 DEBUG oslo_concurrency.processutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy3twfaka" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:14:41 np0005539505 kernel: tap713aec40-e8: entered promiscuous mode
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.1826] manager: (tap713aec40-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/202)
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.184 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:41Z|00391|binding|INFO|Claiming lport 713aec40-e89c-478d-ba3a-018459dfab17 for this chassis.
Nov 29 02:14:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:41Z|00392|binding|INFO|713aec40-e89c-478d-ba3a-018459dfab17: Claiming fa:16:3e:d0:72:87 10.100.0.5
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.194 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:72:87 10.100.0.5'], port_security=['fa:16:3e:d0:72:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5f3373c-f7bf-4750-84f2-e76242aed770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4bbba178-6567-4234-b6c4-90095d9c80ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45ac6440-1523-49e7-be17-f00b457c625b, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=713aec40-e89c-478d-ba3a-018459dfab17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.195 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 713aec40-e89c-478d-ba3a-018459dfab17 in datapath e5f3373c-f7bf-4750-84f2-e76242aed770 bound to our chassis#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.197 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e5f3373c-f7bf-4750-84f2-e76242aed770#033[00m
Nov 29 02:14:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:41Z|00393|binding|INFO|Setting lport 713aec40-e89c-478d-ba3a-018459dfab17 ovn-installed in OVS
Nov 29 02:14:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:41Z|00394|binding|INFO|Setting lport 713aec40-e89c-478d-ba3a-018459dfab17 up in Southbound
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.211 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9dce74f9-e2a7-4a56-b12c-57214032d4c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 systemd-udevd[231880]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.212 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape5f3373c-f1 in ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.215 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape5f3373c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.215 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b850770d-1fd1-4ee1-ac82-b046b9dc32b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.216 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d097b9f3-d6b5-48f4-955a-7e8c71523533]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 systemd-machined[153285]: New machine qemu-50-instance-00000062.
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.2261] device (tap713aec40-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.2273] device (tap713aec40-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.229 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[befdf810-5f19-4d9b-b60c-3e7d7c073a4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 systemd[1]: Started Virtual Machine qemu-50-instance-00000062.
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.252 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fcb453-55f5-4336-85fc-3601de01f1fc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.276 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[05b07059-1b13-48cc-90d6-038ef4236c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 systemd-udevd[231884]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.281 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d6338cd4-b887-41b5-80ad-c5a42f25240c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.2820] manager: (tape5f3373c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/203)
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.308 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a5aa70e8-4cfc-47f8-8cb6-ffefe9077672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.311 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[baf7c8f1-2727-47bf-a889-ad912eab53e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.3374] device (tape5f3373c-f0): carrier: link connected
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.344 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6afe796b-5ddb-4b54-a387-bdc30ad88975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[45d9764e-b1c2-4cdd-93c8-f89e80a27d7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5f3373c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:14:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592892, 'reachable_time': 43247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231913, 'error': None, 'target': 'ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.373 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b7dee24f-b5b4-4330-b215-db502c997fa6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe21:14be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 592892, 'tstamp': 592892}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231914, 'error': None, 'target': 'ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.388 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0d3af6f7-ca0b-4141-91ac-557709a6a704]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape5f3373c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:21:14:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 132], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592892, 'reachable_time': 43247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231915, 'error': None, 'target': 'ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.415 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3d2fca-5a83-4cfe-a5a9-b0abff517750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.467 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a79fcc7c-4762-44af-98d6-b65a3fcce6fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.469 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5f3373c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.469 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.470 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5f3373c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:41 np0005539505 kernel: tape5f3373c-f0: entered promiscuous mode
Nov 29 02:14:41 np0005539505 NetworkManager[55134]: <info>  [1764400481.4726] manager: (tape5f3373c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.475 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape5f3373c-f0, col_values=(('external_ids', {'iface-id': '6b2429b5-2f2c-490d-b227-bcdd5f9a89eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:41Z|00395|binding|INFO|Releasing lport 6b2429b5-2f2c-490d-b227-bcdd5f9a89eb from this chassis (sb_readonly=0)
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.487 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e5f3373c-f7bf-4750-84f2-e76242aed770.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e5f3373c-f7bf-4750-84f2-e76242aed770.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.487 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb732c5-0ab5-4304-8d7b-14728ab2ce63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.488 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-e5f3373c-f7bf-4750-84f2-e76242aed770
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/e5f3373c-f7bf-4750-84f2-e76242aed770.pid.haproxy
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID e5f3373c-f7bf-4750-84f2-e76242aed770
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:14:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:41.489 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770', 'env', 'PROCESS_TAG=haproxy-e5f3373c-f7bf-4750-84f2-e76242aed770', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e5f3373c-f7bf-4750-84f2-e76242aed770.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.555 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400481.555002, 61add905-a2e1-4724-a51d-25f62f9bfc45 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.555 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] VM Started (Lifecycle Event)#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.578 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.581 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400481.557967, 61add905-a2e1-4724-a51d-25f62f9bfc45 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.581 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.604 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.607 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:41 np0005539505 nova_compute[186958]: 2025-11-29 07:14:41.630 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:14:41 np0005539505 podman[231954]: 2025-11-29 07:14:41.826254978 +0000 UTC m=+0.044700819 container create 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:14:41 np0005539505 systemd[1]: Started libpod-conmon-62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885.scope.
Nov 29 02:14:41 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:14:41 np0005539505 podman[231954]: 2025-11-29 07:14:41.800421195 +0000 UTC m=+0.018867036 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:14:41 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c46d3b42bdcf79cc037105661a5dcd27d98c09d6c7b4b31d0948539d18953926/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:14:41 np0005539505 podman[231954]: 2025-11-29 07:14:41.923729784 +0000 UTC m=+0.142175645 container init 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:14:41 np0005539505 podman[231954]: 2025-11-29 07:14:41.929956851 +0000 UTC m=+0.148402692 container start 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:14:41 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [NOTICE]   (231973) : New worker (231975) forked
Nov 29 02:14:41 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [NOTICE]   (231973) : Loading success.
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.639 186962 DEBUG nova.compute.manager [req-1d4b517e-d45c-4442-9661-7d4793246c23 req-17ee02fd-aeb8-41ad-84d5-7ece7513a9af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.640 186962 DEBUG oslo_concurrency.lockutils [req-1d4b517e-d45c-4442-9661-7d4793246c23 req-17ee02fd-aeb8-41ad-84d5-7ece7513a9af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.640 186962 DEBUG oslo_concurrency.lockutils [req-1d4b517e-d45c-4442-9661-7d4793246c23 req-17ee02fd-aeb8-41ad-84d5-7ece7513a9af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.640 186962 DEBUG oslo_concurrency.lockutils [req-1d4b517e-d45c-4442-9661-7d4793246c23 req-17ee02fd-aeb8-41ad-84d5-7ece7513a9af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.641 186962 DEBUG nova.compute.manager [req-1d4b517e-d45c-4442-9661-7d4793246c23 req-17ee02fd-aeb8-41ad-84d5-7ece7513a9af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Processing event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.641 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.645 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400482.6449015, 61add905-a2e1-4724-a51d-25f62f9bfc45 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.645 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.646 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.649 186962 INFO nova.virt.libvirt.driver [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Instance spawned successfully.#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.649 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.687 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.692 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.692 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.693 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.693 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.694 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.695 186962 DEBUG nova.virt.libvirt.driver [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.700 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.742 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.786 186962 INFO nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Took 12.29 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.787 186962 DEBUG nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.878 186962 DEBUG nova.network.neutron [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Updated VIF entry in instance network info cache for port 713aec40-e89c-478d-ba3a-018459dfab17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.878 186962 DEBUG nova.network.neutron [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Updating instance_info_cache with network_info: [{"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.920 186962 DEBUG oslo_concurrency.lockutils [req-94cc39eb-676b-456f-8775-4098a9123444 req-1a5e4c92-3d41-460a-9630-dd4f71954ac8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-61add905-a2e1-4724-a51d-25f62f9bfc45" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.938 186962 INFO nova.compute.manager [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Took 13.38 seconds to build instance.#033[00m
Nov 29 02:14:42 np0005539505 nova_compute[186958]: 2025-11-29 07:14:42.960 186962 DEBUG oslo_concurrency.lockutils [None req-6ef29b4b-97a2-4df4-9a38-e394e855c0de 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.745 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.952 186962 DEBUG nova.compute.manager [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.953 186962 DEBUG oslo_concurrency.lockutils [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.953 186962 DEBUG oslo_concurrency.lockutils [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.953 186962 DEBUG oslo_concurrency.lockutils [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.954 186962 DEBUG nova.compute.manager [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] No waiting events found dispatching network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:44 np0005539505 nova_compute[186958]: 2025-11-29 07:14:44.954 186962 WARNING nova.compute.manager [req-b06c9b69-7b6a-4127-a558-08829bb57c97 req-b72702ea-3a8b-4787-abb8-9fa3f6669c98 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received unexpected event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:14:45 np0005539505 nova_compute[186958]: 2025-11-29 07:14:45.309 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:45 np0005539505 podman[231985]: 2025-11-29 07:14:45.726822469 +0000 UTC m=+0.053180740 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:14:45 np0005539505 podman[231984]: 2025-11-29 07:14:45.730287428 +0000 UTC m=+0.060279482 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.089 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000062', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'user_id': '970e2faf924c47f4baca59706f567228', 'hostId': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.090 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.100 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.101 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df196c10-df94-45b5-8301-ec77a811c759', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.090311', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16da5160-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': '45a484fe23b9d55ad53dd0f57dbddac22e162c7d522f3791c960393cc072ef35'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.090311', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16da63da-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': '7c5fb253c652d983e8f3e9eab667a42f5cd0fe349d66ce8e7c39141c58f646e8'}]}, 'timestamp': '2025-11-29 07:14:48.101729', '_unique_id': 'a930ddab857542b3a45975702a415b80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.103 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.117 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.117 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 61add905-a2e1-4724-a51d-25f62f9bfc45: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.138 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.139 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a20f9bd8-9436-4abd-866b-d57e1139a550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.118021', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e023ec-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': 'ef39b78bc01b7a03dfc84469197760029555b0117f558fdd1ba7d6a0e8e54b3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.118021', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e030e4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '6efd057581d9db686bb1ecdd27db04cc189f0da4717483f807359ee343d09b55'}]}, 'timestamp': '2025-11-29 07:14:48.139714', '_unique_id': '31a924a65bd24bff9ce0cb8f29799aef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.140 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.141 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c149e9d7-94af-4b9a-955d-b64042d4e7d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.141731', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e08bc0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '12e8a3e189b9a9a3e78ba1c75f7b5fd7122f9795d99a20ac4c60430c9316f40b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.141731', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e0969c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '578a06722bdb91321308de8d4f00fed3265a9a68b52644a0b04d0fe3e453e046'}]}, 'timestamp': '2025-11-29 07:14:48.142337', '_unique_id': '6c535543c564451fb40ac5b0afe71213'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.143 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>]
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>]
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.147 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 61add905-a2e1-4724-a51d-25f62f9bfc45 / tap713aec40-e8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.147 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6eda349-a597-41f1-815c-cdc6a426c84d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.144733', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e16860-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': 'cf58a5f67322721f41aa5368b191779e71cf22c6c3dedae8bc6a1ba1ca79c1d8'}]}, 'timestamp': '2025-11-29 07:14:48.147686', '_unique_id': '45370718d84641d989a36b1d5fafed3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.148 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '944723b1-01e0-4625-af25-be88745d5cec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.149139', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e1ac76-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': '69fdbdb449ba80f484feda36f04c7b944e4c24724373a9c66861660d4cd599b2'}]}, 'timestamp': '2025-11-29 07:14:48.149422', '_unique_id': '0825d1912dfd4ac6ab6ec742af2dcb05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.150 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.150 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>]
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerMetadataNegativeTestJSON-server-994196049>]
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.151 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e7d7b7f-68bc-4b7e-b233-0b8409cdca54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.151608', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e20bee-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': '0b1235f557ee1534ecbdd61a6c0cfdfd1e1a44893e58380f68d2944a3a0e04cd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.151608', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e2151c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': '0e74195f68645fbd09dc4b0cbd12c83231904eedf8c7b0e26a66bd53e63c4c56'}]}, 'timestamp': '2025-11-29 07:14:48.152086', '_unique_id': '6f669be426964062a15d98ecba0d8d35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.153 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a97a8b77-59f9-48e0-937a-bc0c4a6122c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.153526', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e257b6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': 'de436f9d8352e04c1f09371b848f077cf727e7296d567165e3e53f0e913ee3ad'}]}, 'timestamp': '2025-11-29 07:14:48.153821', '_unique_id': '011146574e4e426bb2e896381306abb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.155 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70db87c9-8d2c-4509-b0b8-0bc2fd949df5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.155688', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e2ac66-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': '820eb0f7756d7c6eeaccaba5df1648239d5f0b94d7959dd34685f9c319246fed'}]}, 'timestamp': '2025-11-29 07:14:48.155990', '_unique_id': 'b60564c853cb4a90a6b64ee97784cb02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.157 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c641cb79-98f9-432c-be1c-cf609172983c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.157545', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e2f464-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': '9f4e1aecdf84d3a882edf79166fefafff2a5842ef8887340aa078fcec289f549'}]}, 'timestamp': '2025-11-29 07:14:48.157816', '_unique_id': 'ea6f7dc841c3463e9d8d3a1f77fff350'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '078d23dd-3d50-4fa6-89bf-04c688d0d67d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.159198', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e3356e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': 'd1b1313bb720ec2c2efad586138fb797a48ce5ad15348b7dc3970be147bc5d28'}]}, 'timestamp': '2025-11-29 07:14:48.159481', '_unique_id': '4736e71d3b2345d7b200c2ad826963f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.160 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0d1b57f-bc39-4a08-85ee-c5de8af8da74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.160846', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e374fc-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': 'e623c948306f9af70562d6790ea17b808e5289addcdab54cd1b06e2e0358e94f'}]}, 'timestamp': '2025-11-29 07:14:48.161107', '_unique_id': '3f41cefb88dc44068af04f8fc12dbd66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.161 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.162 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.162 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf30a2b5-ba7f-4860-a5b2-60e51232b852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.162537', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e3b73c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '2189e2d54145ce0cd75f81bef682f0cfbe53cc83cd050f6b65ab76cc7b4908a5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.162537', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e3c06a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '722dba6551f6a50c45647a9cfef719b594e9a81455adf789ee259f1a70bc5acc'}]}, 'timestamp': '2025-11-29 07:14:48.163026', '_unique_id': '52f6cf490937426ea0008e7399dc1f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.164 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '488d00aa-3719-403c-86f2-0f73294b4f94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.164442', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e40174-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': 'e3c4745ed5eb1c96b29a5201b34a5300a9def00ed7858a99b72387010909f41b'}]}, 'timestamp': '2025-11-29 07:14:48.164703', '_unique_id': '2cde18c244294470932249a01fd7365b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.168 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.latency volume: 141487086 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.169 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.latency volume: 700620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d0686a2-742f-4a86-a34d-8e9d41e0937f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 141487086, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.168911', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e4b15a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '554291bd7c75a52aed0211215dde5c1f95ddb6c7c8b341582d0d46ccae36856d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 700620, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.168911', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e4bbfa-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': 'a854067c71d7a1c84da7dfb4d160c24e0255eb22e37087b1ab3849391bd48592'}]}, 'timestamp': '2025-11-29 07:14:48.169467', '_unique_id': 'f9d65a8cdbcb4b67b94c434e7d2c3a12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.170 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.171 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.171 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98a6a3d4-37ce-4bd4-bd1d-4cea1cb4e3d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.171251', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e50be6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': '9b62489a9a8952f77fb9bda40c18b4eba768d38de0edfe1cd875c194a5fdf310'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 
'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.171251', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e51604-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.731147825, 'message_signature': 'cf1c628574b90ecd403cd6f67f459df35ca05befc9b03ed2b7eba16d60a12cf0'}]}, 'timestamp': '2025-11-29 07:14:48.171808', '_unique_id': '15eb8a87e7e645eb81b6d96d027a2b5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.173 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.173 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9216acae-3174-4fbd-9916-e158a100e735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.173391', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e55fd8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '0d89c6b3fa3601cb5ee71d9df3b42f04faf6592c027162ebadf461f8ba4d374e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 
'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.173391', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e56a32-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '294e7584ed086b048ebac8f9a4ec8caff098ed0c4749511b787161600a0620cb'}]}, 'timestamp': '2025-11-29 07:14:48.173930', '_unique_id': '541eac646b17460cb395788e3d3a7b12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.176 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.176 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e594ae02-133b-43c5-af47-7c1e987f054e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-vda', 'timestamp': '2025-11-29T07:14:48.176039', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '16e5c932-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '46ee725308eec0f24ab144d14a30591fcc03e93a898399312a8803fcfc9cfb40'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 
'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45-sda', 'timestamp': '2025-11-29T07:14:48.176039', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '16e5d328-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758845951, 'message_signature': '02457bb0cb3fd214f12c53dd19b6af9fdc2a9c435f8628f40cb532ef51130e63'}]}, 'timestamp': '2025-11-29 07:14:48.176644', '_unique_id': '4e3b3553160f4c629cc37d0fa2740a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.178 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.178 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/cpu volume: 5280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24994c26-4b68-4d61-bd44-f44f1c28880a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5280000000, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'timestamp': '2025-11-29T07:14:48.178520', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'instance-00000062', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '16e627ce-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.758326466, 'message_signature': '27fb2c67f6ca7c4277c4763505ee5563c4410068fe5efb4f898959614bcabbb8'}]}, 'timestamp': '2025-11-29 07:14:48.178924', '_unique_id': '2cd8f5d6c1724e30a61ecba8479e064f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.180 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1764014e-f086-411a-8d04-0c661a8c58fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.180393', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e670d0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': '2ecead3d800a8d719377de0008f85e0cb8ad4294fcb8c50923b50d968d3f4e38'}]}, 'timestamp': '2025-11-29 07:14:48.180661', '_unique_id': '4435af0c178b4ae7a30ea1f868d12e96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 DEBUG ceilometer.compute.pollsters [-] 61add905-a2e1-4724-a51d-25f62f9bfc45/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec27ba55-e5c0-4a68-b319-8da12a1dcd19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '970e2faf924c47f4baca59706f567228', 'user_name': None, 'project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'project_name': None, 'resource_id': 'instance-00000062-61add905-a2e1-4724-a51d-25f62f9bfc45-tap713aec40-e8', 'timestamp': '2025-11-29T07:14:48.182105', 'resource_metadata': {'display_name': 'tempest-ServerMetadataNegativeTestJSON-server-994196049', 'name': 'tap713aec40-e8', 'instance_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'instance_type': 'm1.nano', 'host': '2f81117e0189be218f7ff71fa33e4d6dd1280d1b7f4bfdee4c04fd94', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d0:72:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap713aec40-e8'}, 'message_id': '16e6b432-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 5935.785576319, 'message_signature': '172f8797f75a2491f9e14b5c06ad77e15ba703a686aa04be0b6329d3d91ade9a'}]}, 'timestamp': '2025-11-29 07:14:48.182388', '_unique_id': '5bbdb59d0ba24c95b3e4cb38eb4f594b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:14:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:14:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.382 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.382 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.382 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.383 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.383 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.420 186962 INFO nova.compute.manager [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Terminating instance
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.492 186962 DEBUG nova.compute.manager [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 29 02:14:49 np0005539505 kernel: tap713aec40-e8 (unregistering): left promiscuous mode
Nov 29 02:14:49 np0005539505 NetworkManager[55134]: <info>  [1764400489.5221] device (tap713aec40-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:49Z|00396|binding|INFO|Releasing lport 713aec40-e89c-478d-ba3a-018459dfab17 from this chassis (sb_readonly=0)
Nov 29 02:14:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:49Z|00397|binding|INFO|Setting lport 713aec40-e89c-478d-ba3a-018459dfab17 down in Southbound
Nov 29 02:14:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:14:49Z|00398|binding|INFO|Removing iface tap713aec40-e8 ovn-installed in OVS
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.534 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.549 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.570 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:72:87 10.100.0.5'], port_security=['fa:16:3e:d0:72:87 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '61add905-a2e1-4724-a51d-25f62f9bfc45', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e5f3373c-f7bf-4750-84f2-e76242aed770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f74fdd4985ee459e9ad295d5f888bd61', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4bbba178-6567-4234-b6c4-90095d9c80ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=45ac6440-1523-49e7-be17-f00b457c625b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=713aec40-e89c-478d-ba3a-018459dfab17) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.572 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 713aec40-e89c-478d-ba3a-018459dfab17 in datapath e5f3373c-f7bf-4750-84f2-e76242aed770 unbound from our chassis#033[00m
Nov 29 02:14:49 np0005539505 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000062.scope: Deactivated successfully.
Nov 29 02:14:49 np0005539505 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000062.scope: Consumed 7.344s CPU time.
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.574 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e5f3373c-f7bf-4750-84f2-e76242aed770, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.575 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[be3918ca-0fa3-4b1a-80a6-7c6b8af8b89c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.576 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770 namespace which is not needed anymore#033[00m
Nov 29 02:14:49 np0005539505 systemd-machined[153285]: Machine qemu-50-instance-00000062 terminated.
Nov 29 02:14:49 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [NOTICE]   (231973) : haproxy version is 2.8.14-c23fe91
Nov 29 02:14:49 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [NOTICE]   (231973) : path to executable is /usr/sbin/haproxy
Nov 29 02:14:49 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [WARNING]  (231973) : Exiting Master process...
Nov 29 02:14:49 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [ALERT]    (231973) : Current worker (231975) exited with code 143 (Terminated)
Nov 29 02:14:49 np0005539505 neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770[231969]: [WARNING]  (231973) : All workers exited. Exiting... (0)
Nov 29 02:14:49 np0005539505 systemd[1]: libpod-62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885.scope: Deactivated successfully.
Nov 29 02:14:49 np0005539505 podman[232048]: 2025-11-29 07:14:49.705984681 +0000 UTC m=+0.044349110 container died 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:14:49 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885-userdata-shm.mount: Deactivated successfully.
Nov 29 02:14:49 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c46d3b42bdcf79cc037105661a5dcd27d98c09d6c7b4b31d0948539d18953926-merged.mount: Deactivated successfully.
Nov 29 02:14:49 np0005539505 podman[232048]: 2025-11-29 07:14:49.748570299 +0000 UTC m=+0.086934728 container cleanup 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.757 186962 INFO nova.virt.libvirt.driver [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Instance destroyed successfully.#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.757 186962 DEBUG nova.objects.instance [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lazy-loading 'resources' on Instance uuid 61add905-a2e1-4724-a51d-25f62f9bfc45 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.784 186962 DEBUG nova.virt.libvirt.vif [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-994196049',display_name='tempest-ServerMetadataNegativeTestJSON-server-994196049',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-994196049',id=98,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:14:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f74fdd4985ee459e9ad295d5f888bd61',ramdisk_id='',reservation_id='r-18550hz4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-374554764',owner_user_name='tempest-ServerMetadataNegativeTestJSON-374554764-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:14:42Z,user_data=None,user_id='970e2faf924c47f4baca59706f567228',uuid=61add905-a2e1-4724-a51d-25f62f9bfc45,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.785 186962 DEBUG nova.network.os_vif_util [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converting VIF {"id": "713aec40-e89c-478d-ba3a-018459dfab17", "address": "fa:16:3e:d0:72:87", "network": {"id": "e5f3373c-f7bf-4750-84f2-e76242aed770", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-1189278736-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f74fdd4985ee459e9ad295d5f888bd61", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap713aec40-e8", "ovs_interfaceid": "713aec40-e89c-478d-ba3a-018459dfab17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.786 186962 DEBUG nova.network.os_vif_util [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.786 186962 DEBUG os_vif [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.788 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.790 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap713aec40-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.791 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.794 186962 INFO os_vif [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:72:87,bridge_name='br-int',has_traffic_filtering=True,id=713aec40-e89c-478d-ba3a-018459dfab17,network=Network(e5f3373c-f7bf-4750-84f2-e76242aed770),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap713aec40-e8')#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.795 186962 INFO nova.virt.libvirt.driver [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Deleting instance files /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45_del#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.796 186962 INFO nova.virt.libvirt.driver [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Deletion of /var/lib/nova/instances/61add905-a2e1-4724-a51d-25f62f9bfc45_del complete#033[00m
Nov 29 02:14:49 np0005539505 systemd[1]: libpod-conmon-62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885.scope: Deactivated successfully.
Nov 29 02:14:49 np0005539505 podman[232094]: 2025-11-29 07:14:49.859610129 +0000 UTC m=+0.050610827 container remove 62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb52af5-c6d8-4f94-a6e5-42a56833c789]: (4, ('Sat Nov 29 07:14:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770 (62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885)\n62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885\nSat Nov 29 07:14:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770 (62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885)\n62829f30fdda0b9151370e0fe858172f2fc179874168b79c07b083494640b885\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.866 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d7e170-8884-4d8e-ad70-77ff66064027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.867 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5f3373c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.868 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 kernel: tape5f3373c-f0: left promiscuous mode
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.873 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e12490c8-0faa-41ff-933e-0d22b8b0c866]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 nova_compute[186958]: 2025-11-29 07:14:49.882 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.894 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a66d80-3607-4d1c-bb2c-14bb20fa11a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.896 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc0afd3-3218-4c10-9be5-cf11fb3c0ba3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.914 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[26f142ae-db97-46bb-8cdb-82fb3604353e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 592885, 'reachable_time': 43740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232106, 'error': None, 'target': 'ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.917 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e5f3373c-f7bf-4750-84f2-e76242aed770 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:14:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:49.917 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3fabffed-fea8-4b8b-968d-b249a7cf5e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:14:49 np0005539505 systemd[1]: run-netns-ovnmeta\x2de5f3373c\x2df7bf\x2d4750\x2d84f2\x2de76242aed770.mount: Deactivated successfully.
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.271 186962 DEBUG nova.compute.manager [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-unplugged-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.271 186962 DEBUG oslo_concurrency.lockutils [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.272 186962 DEBUG oslo_concurrency.lockutils [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.272 186962 DEBUG oslo_concurrency.lockutils [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.272 186962 DEBUG nova.compute.manager [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] No waiting events found dispatching network-vif-unplugged-713aec40-e89c-478d-ba3a-018459dfab17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.272 186962 DEBUG nova.compute.manager [req-5f09bf1a-4662-4dbb-95d9-acc40f506dde req-b6b7494c-e242-47a6-8d71-efcc7ad5c38c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-unplugged-713aec40-e89c-478d-ba3a-018459dfab17 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.444 186962 INFO nova.compute.manager [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Took 0.95 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.445 186962 DEBUG oslo.service.loopingcall [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.445 186962 DEBUG nova.compute.manager [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:14:50 np0005539505 nova_compute[186958]: 2025-11-29 07:14:50.446 186962 DEBUG nova.network.neutron [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.327 186962 DEBUG nova.compute.manager [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.327 186962 DEBUG oslo_concurrency.lockutils [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.328 186962 DEBUG oslo_concurrency.lockutils [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.328 186962 DEBUG oslo_concurrency.lockutils [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.328 186962 DEBUG nova.compute.manager [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] No waiting events found dispatching network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:14:53 np0005539505 nova_compute[186958]: 2025-11-29 07:14:53.329 186962 WARNING nova.compute.manager [req-b5f232c7-127c-4749-b90e-e1e4c6f8226d req-7297dc96-7108-4f4b-993e-833553e03973 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received unexpected event network-vif-plugged-713aec40-e89c-478d-ba3a-018459dfab17 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:14:54 np0005539505 nova_compute[186958]: 2025-11-29 07:14:54.791 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.262 186962 DEBUG nova.network.neutron [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.288 186962 INFO nova.compute.manager [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Took 4.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.376 186962 DEBUG nova.compute.manager [req-42d6e3f0-387a-4b52-8b23-a4c2b39e6421 req-64a850bc-d944-4446-a935-9a59e58b72aa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Received event network-vif-deleted-713aec40-e89c-478d-ba3a-018459dfab17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.397 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.397 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.470 186962 DEBUG nova.compute.provider_tree [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.496 186962 DEBUG nova.scheduler.client.report [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.531 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.570 186962 INFO nova.scheduler.client.report [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Deleted allocations for instance 61add905-a2e1-4724-a51d-25f62f9bfc45#033[00m
Nov 29 02:14:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:55.632 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:14:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:55.633 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:14:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:14:55.634 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:14:55 np0005539505 nova_compute[186958]: 2025-11-29 07:14:55.701 186962 DEBUG oslo_concurrency.lockutils [None req-bc8a22a9-2dc3-451d-bf59-f891a85224a7 970e2faf924c47f4baca59706f567228 f74fdd4985ee459e9ad295d5f888bd61 - - default default] Lock "61add905-a2e1-4724-a51d-25f62f9bfc45" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:14:58 np0005539505 podman[232109]: 2025-11-29 07:14:58.728508228 +0000 UTC m=+0.059737296 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:14:58 np0005539505 podman[232108]: 2025-11-29 07:14:58.737828622 +0000 UTC m=+0.070986925 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:14:59 np0005539505 nova_compute[186958]: 2025-11-29 07:14:59.381 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:14:59 np0005539505 nova_compute[186958]: 2025-11-29 07:14:59.793 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:01 np0005539505 podman[232151]: 2025-11-29 07:15:01.751045925 +0000 UTC m=+0.083182941 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:15:03 np0005539505 nova_compute[186958]: 2025-11-29 07:15:03.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:03 np0005539505 nova_compute[186958]: 2025-11-29 07:15:03.668 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:03 np0005539505 nova_compute[186958]: 2025-11-29 07:15:03.847 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:04 np0005539505 nova_compute[186958]: 2025-11-29 07:15:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:04 np0005539505 nova_compute[186958]: 2025-11-29 07:15:04.756 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400489.7550495, 61add905-a2e1-4724-a51d-25f62f9bfc45 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:04 np0005539505 nova_compute[186958]: 2025-11-29 07:15:04.756 186962 INFO nova.compute.manager [-] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:15:04 np0005539505 nova_compute[186958]: 2025-11-29 07:15:04.783 186962 DEBUG nova.compute.manager [None req-2122f1e3-25ca-4f3f-99a6-60bbf87dab5b - - - - - -] [instance: 61add905-a2e1-4724-a51d-25f62f9bfc45] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:04 np0005539505 nova_compute[186958]: 2025-11-29 07:15:04.795 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.441 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.442 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.442 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.442 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.587 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.588 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5725MB free_disk=73.22476196289062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.588 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:08 np0005539505 nova_compute[186958]: 2025-11-29 07:15:08.588 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.277 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.277 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.303 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.339 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.384 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.385 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:09 np0005539505 nova_compute[186958]: 2025-11-29 07:15:09.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:10 np0005539505 podman[232172]: 2025-11-29 07:15:10.710076351 +0000 UTC m=+0.047211170 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:15:10 np0005539505 podman[232173]: 2025-11-29 07:15:10.750090076 +0000 UTC m=+0.080901146 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:15:12 np0005539505 nova_compute[186958]: 2025-11-29 07:15:12.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:12 np0005539505 nova_compute[186958]: 2025-11-29 07:15:12.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:12 np0005539505 nova_compute[186958]: 2025-11-29 07:15:12.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:15:13 np0005539505 nova_compute[186958]: 2025-11-29 07:15:13.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:13 np0005539505 nova_compute[186958]: 2025-11-29 07:15:13.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:15:13 np0005539505 nova_compute[186958]: 2025-11-29 07:15:13.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:15:13 np0005539505 nova_compute[186958]: 2025-11-29 07:15:13.437 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:15:13 np0005539505 nova_compute[186958]: 2025-11-29 07:15:13.437 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.799 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:14 np0005539505 nova_compute[186958]: 2025-11-29 07:15:14.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:16 np0005539505 podman[232222]: 2025-11-29 07:15:16.717671555 +0000 UTC m=+0.055034512 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS)
Nov 29 02:15:16 np0005539505 podman[232223]: 2025-11-29 07:15:16.719078735 +0000 UTC m=+0.052676715 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:15:19 np0005539505 nova_compute[186958]: 2025-11-29 07:15:19.803 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:22 np0005539505 nova_compute[186958]: 2025-11-29 07:15:22.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.805 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.807 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:24 np0005539505 nova_compute[186958]: 2025-11-29 07:15:24.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:26.954 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:26.954 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:26.954 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:29 np0005539505 podman[232265]: 2025-11-29 07:15:29.731853897 +0000 UTC m=+0.052003357 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:15:29 np0005539505 podman[232264]: 2025-11-29 07:15:29.731845636 +0000 UTC m=+0.051758269 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Nov 29 02:15:29 np0005539505 nova_compute[186958]: 2025-11-29 07:15:29.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:29 np0005539505 nova_compute[186958]: 2025-11-29 07:15:29.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:32 np0005539505 podman[232307]: 2025-11-29 07:15:32.706781824 +0000 UTC m=+0.043831344 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:15:34 np0005539505 nova_compute[186958]: 2025-11-29 07:15:34.811 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:34 np0005539505 nova_compute[186958]: 2025-11-29 07:15:34.812 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.107 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.108 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.132 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.133 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.133 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.160 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.243 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.244 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.253 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.253 186962 INFO nova.compute.claims [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.429 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.540 186962 DEBUG nova.compute.provider_tree [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.560 186962 DEBUG nova.scheduler.client.report [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.596 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.352s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.597 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.600 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.606 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.607 186962 INFO nova.compute.claims [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.694 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.695 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.714 186962 INFO nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.734 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.790 186962 DEBUG nova.compute.provider_tree [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.814 186962 DEBUG nova.scheduler.client.report [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.847 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.848 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.907 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.908 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.908 186962 INFO nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Creating image(s)#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.909 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.909 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.910 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.922 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.942 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.942 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.973 186962 INFO nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.977 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.977 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.978 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:35 np0005539505 nova_compute[186958]: 2025-11-29 07:15:35.989 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.005 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.018 186962 DEBUG nova.policy [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.042 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.042 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.119 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk 1073741824" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.120 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.120 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.137 186962 DEBUG nova.policy [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1192c8e43d154d7496b324c217093f43', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '95d2550fe1b64fe3b009f5592973da32', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.144 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.146 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.146 186962 INFO nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Creating image(s)#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.147 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.147 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.148 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.167 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.185 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.186 186962 DEBUG nova.virt.disk.api [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Checking if we can resize image /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.187 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.224 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.225 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.226 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.238 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.255 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.256 186962 DEBUG nova.virt.disk.api [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Cannot resize image /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.257 186962 DEBUG nova.objects.instance [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'migration_context' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.269 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.269 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Ensure instance console log exists: /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.270 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.270 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.270 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.291 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.292 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.751 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk 1073741824" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.752 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.753 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.810 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.811 186962 DEBUG nova.virt.disk.api [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Checking if we can resize image /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.811 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.865 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.866 186962 DEBUG nova.virt.disk.api [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Cannot resize image /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.867 186962 DEBUG nova.objects.instance [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lazy-loading 'migration_context' on Instance uuid a7c751db-add0-45f3-8b76-5f5474c66e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.882 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.882 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Ensure instance console log exists: /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.882 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.883 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:36 np0005539505 nova_compute[186958]: 2025-11-29 07:15:36.883 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:37 np0005539505 nova_compute[186958]: 2025-11-29 07:15:37.588 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Successfully created port: 398d5fef-7a9f-4bbe-8db0-754814939ba5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:15:37 np0005539505 nova_compute[186958]: 2025-11-29 07:15:37.612 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully created port: 7f253c88-5c90-410c-bbe6-a152ae7c3a63 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.668 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully updated port: 7f253c88-5c90-410c-bbe6-a152ae7c3a63 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.709 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Successfully updated port: 398d5fef-7a9f-4bbe-8db0-754814939ba5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.812 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:39 np0005539505 nova_compute[186958]: 2025-11-29 07:15:39.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:41 np0005539505 podman[232356]: 2025-11-29 07:15:41.715935171 +0000 UTC m=+0.051911814 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:15:41 np0005539505 podman[232357]: 2025-11-29 07:15:41.776107628 +0000 UTC m=+0.109259451 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.742 186962 DEBUG nova.compute.manager [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-changed-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.742 186962 DEBUG nova.compute.manager [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing instance network info cache due to event network-changed-7f253c88-5c90-410c-bbe6-a152ae7c3a63. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.742 186962 DEBUG oslo_concurrency.lockutils [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.743 186962 DEBUG oslo_concurrency.lockutils [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.743 186962 DEBUG nova.network.neutron [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing network info cache for port 7f253c88-5c90-410c-bbe6-a152ae7c3a63 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.744 186962 DEBUG nova.compute.manager [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-changed-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.745 186962 DEBUG nova.compute.manager [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Refreshing instance network info cache due to event network-changed-398d5fef-7a9f-4bbe-8db0-754814939ba5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.745 186962 DEBUG oslo_concurrency.lockutils [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.745 186962 DEBUG oslo_concurrency.lockutils [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.745 186962 DEBUG nova.network.neutron [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Refreshing network info cache for port 398d5fef-7a9f-4bbe-8db0-754814939ba5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:15:44 np0005539505 nova_compute[186958]: 2025-11-29 07:15:44.817 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:45 np0005539505 nova_compute[186958]: 2025-11-29 07:15:45.110 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:45 np0005539505 nova_compute[186958]: 2025-11-29 07:15:45.283 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:15:46 np0005539505 nova_compute[186958]: 2025-11-29 07:15:46.356 186962 DEBUG nova.network.neutron [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:15:46 np0005539505 nova_compute[186958]: 2025-11-29 07:15:46.373 186962 DEBUG nova.network.neutron [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:15:46 np0005539505 nova_compute[186958]: 2025-11-29 07:15:46.694 186962 DEBUG nova.network.neutron [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:46 np0005539505 nova_compute[186958]: 2025-11-29 07:15:46.700 186962 DEBUG nova.network.neutron [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.361 186962 DEBUG oslo_concurrency.lockutils [req-f11def9f-9efe-4a47-90a2-e1b86fdd7abf req-26876aac-1a62-418d-9b70-ab45b4f495ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.362 186962 DEBUG oslo_concurrency.lockutils [req-fef0f251-245e-4a16-a767-6ee4ddbc0aab req-a4d4dec5-55ef-44a4-b640-a49760cc899e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.362 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.363 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.364 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquired lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:15:47 np0005539505 nova_compute[186958]: 2025-11-29 07:15:47.364 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:15:47 np0005539505 podman[232407]: 2025-11-29 07:15:47.727772165 +0000 UTC m=+0.061310109 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:15:47 np0005539505 podman[232408]: 2025-11-29 07:15:47.741281268 +0000 UTC m=+0.072200348 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2)
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.821 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:15:49 np0005539505 nova_compute[186958]: 2025-11-29 07:15:49.821 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:50 np0005539505 nova_compute[186958]: 2025-11-29 07:15:50.454 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:15:50 np0005539505 nova_compute[186958]: 2025-11-29 07:15:50.509 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:15:52 np0005539505 nova_compute[186958]: 2025-11-29 07:15:52.883 186962 DEBUG nova.network.neutron [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.061 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.062 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance network_info: |[{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.067 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Start _get_guest_xml network_info=[{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.073 186962 WARNING nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.078 186962 DEBUG nova.virt.libvirt.host [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.079 186962 DEBUG nova.virt.libvirt.host [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.081 186962 DEBUG nova.virt.libvirt.host [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.082 186962 DEBUG nova.virt.libvirt.host [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.083 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.083 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.084 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.084 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.084 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.084 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.085 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.085 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.085 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.085 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.085 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.086 186962 DEBUG nova.virt.hardware [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.089 186962 DEBUG nova.virt.libvirt.vif [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.089 186962 DEBUG nova.network.os_vif_util [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.090 186962 DEBUG nova.network.os_vif_util [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.090 186962 DEBUG nova.objects.instance [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.184 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:15:53</nova:creationTime>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="serial">7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="uuid">7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:c1:c1:22"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <target dev="tap7f253c88-5c"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log" append="off"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:15:53 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:15:53 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:15:53 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:15:53 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.186 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Preparing to wait for external event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.187 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.188 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.188 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.189 186962 DEBUG nova.virt.libvirt.vif [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.190 186962 DEBUG nova.network.os_vif_util [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.191 186962 DEBUG nova.network.os_vif_util [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.192 186962 DEBUG os_vif [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.193 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.194 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.199 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f253c88-5c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.200 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7f253c88-5c, col_values=(('external_ids', {'iface-id': '7f253c88-5c90-410c-bbe6-a152ae7c3a63', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:c1:22', 'vm-uuid': '7da96eef-5195-4fe9-8421-3b8b79420a86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:53 np0005539505 NetworkManager[55134]: <info>  [1764400553.2026] manager: (tap7f253c88-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.203 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.209 186962 INFO os_vif [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c')#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.339 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.339 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.340 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:c1:c1:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.340 186962 INFO nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Using config drive#033[00m
Nov 29 02:15:53 np0005539505 nova_compute[186958]: 2025-11-29 07:15:53.952 186962 DEBUG nova.network.neutron [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Updating instance_info_cache with network_info: [{"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.043 186962 INFO nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Creating config drive at /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.049 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkkcrd_w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.174 186962 DEBUG oslo_concurrency.processutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkkcrd_w" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.209 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Releasing lock "refresh_cache-a7c751db-add0-45f3-8b76-5f5474c66e46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.210 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance network_info: |[{"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.215 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Start _get_guest_xml network_info=[{"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.224 186962 WARNING nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.229 186962 DEBUG nova.virt.libvirt.host [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.230 186962 DEBUG nova.virt.libvirt.host [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.234 186962 DEBUG nova.virt.libvirt.host [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.235 186962 DEBUG nova.virt.libvirt.host [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.236 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.236 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:15:54 np0005539505 kernel: tap7f253c88-5c: entered promiscuous mode
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.237 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.237 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.237 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.237 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.238 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.238 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.238 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.238 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.239 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.239 186962 DEBUG nova.virt.hardware [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.2410] manager: (tap7f253c88-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Nov 29 02:15:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:54Z|00399|binding|INFO|Claiming lport 7f253c88-5c90-410c-bbe6-a152ae7c3a63 for this chassis.
Nov 29 02:15:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:54Z|00400|binding|INFO|7f253c88-5c90-410c-bbe6-a152ae7c3a63: Claiming fa:16:3e:c1:c1:22 10.100.0.5
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.245 186962 DEBUG nova.virt.libvirt.vif [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-310634475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-310634475',id=100,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='95d2550fe1b64fe3b009f5592973da32',ramdisk_id='',reservation_id='r-2sy09d0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1064287827',owner_user_name='tempest-ServerTagsTestJSON-1064287827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:36Z,user_data=None,user_id='1192c8e43d154d7496b324c217093f43',uuid=a7c751db-add0-45f3-8b76-5f5474c66e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.246 186962 DEBUG nova.network.os_vif_util [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converting VIF {"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.246 186962 DEBUG nova.network.os_vif_util [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.247 186962 DEBUG nova.objects.instance [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lazy-loading 'pci_devices' on Instance uuid a7c751db-add0-45f3-8b76-5f5474c66e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.248 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 systemd-udevd[232462]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.2863] device (tap7f253c88-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.2875] device (tap7f253c88-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:15:54 np0005539505 systemd-machined[153285]: New machine qemu-51-instance-00000065.
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.296 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:c1:22 10.100.0.5'], port_security=['fa:16:3e:c1:c1:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07f51098-ec31-4030-87d3-0b3bc87fde1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7f253c88-5c90-410c-bbe6-a152ae7c3a63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.298 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7f253c88-5c90-410c-bbe6-a152ae7c3a63 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.300 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:15:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:54Z|00401|binding|INFO|Setting lport 7f253c88-5c90-410c-bbe6-a152ae7c3a63 ovn-installed in OVS
Nov 29 02:15:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:54Z|00402|binding|INFO|Setting lport 7f253c88-5c90-410c-bbe6-a152ae7c3a63 up in Southbound
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.304 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 systemd[1]: Started Virtual Machine qemu-51-instance-00000065.
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.312 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[426cae4a-5be8-4ea3-bf9d-fe207547bcea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.314 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90812230-31 in ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.315 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90812230-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.315 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[afde808e-6d57-4aad-8491-abbecb5a745f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.316 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[680e1307-0c7d-4171-9c9d-b7083cd5aacd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.327 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[eb36e55b-9fa1-4700-ae20-c1441aef94e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.347 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <uuid>a7c751db-add0-45f3-8b76-5f5474c66e46</uuid>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <name>instance-00000064</name>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerTagsTestJSON-server-310634475</nova:name>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:15:54</nova:creationTime>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:user uuid="1192c8e43d154d7496b324c217093f43">tempest-ServerTagsTestJSON-1064287827-project-member</nova:user>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:project uuid="95d2550fe1b64fe3b009f5592973da32">tempest-ServerTagsTestJSON-1064287827</nova:project>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        <nova:port uuid="398d5fef-7a9f-4bbe-8db0-754814939ba5">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="serial">a7c751db-add0-45f3-8b76-5f5474c66e46</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="uuid">a7c751db-add0-45f3-8b76-5f5474c66e46</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.config"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:0d:0c:7e"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <target dev="tap398d5fef-7a"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/console.log" append="off"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:15:54 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:15:54 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:15:54 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:15:54 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.348 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Preparing to wait for external event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.348 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.348 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.348 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.348 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4873747-77b8-466e-8204-5dc959ce9175]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.349 186962 DEBUG nova.virt.libvirt.vif [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-310634475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-310634475',id=100,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='95d2550fe1b64fe3b009f5592973da32',ramdisk_id='',reservation_id='r-2sy09d0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerTagsTestJSON-1064287827',owner_user_name='tempest-ServerTagsTestJSON-1064287827-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:15:36Z,user_data=None,user_id='1192c8e43d154d7496b324c217093f43',uuid=a7c751db-add0-45f3-8b76-5f5474c66e46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.350 186962 DEBUG nova.network.os_vif_util [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converting VIF {"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.350 186962 DEBUG nova.network.os_vif_util [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.351 186962 DEBUG os_vif [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.351 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.352 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.354 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.354 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap398d5fef-7a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.355 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap398d5fef-7a, col_values=(('external_ids', {'iface-id': '398d5fef-7a9f-4bbe-8db0-754814939ba5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:0c:7e', 'vm-uuid': 'a7c751db-add0-45f3-8b76-5f5474c66e46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.3571] manager: (tap398d5fef-7a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.363 186962 INFO os_vif [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a')#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.380 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[92ce0a4a-11b8-4d65-aba2-f2c2f6efcf59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.385 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e5a9652c-70c0-402b-9bab-09624a45fc72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.3864] manager: (tap90812230-30): new Veth device (/org/freedesktop/NetworkManager/Devices/208)
Nov 29 02:15:54 np0005539505 systemd-udevd[232465]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.415 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[684f3390-c279-49ad-a858-89b38bdfe7e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.419 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[366d1c81-a8f1-4842-94c6-a0959bdb35f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.4387] device (tap90812230-30): carrier: link connected
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.444 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dc094e15-cd25-4fcb-ae43-25a539dbe51f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.458 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e320b190-cc71-4988-9e01-1660c1582851]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232500, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.473 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[332ea0b1-6ac5-40f2-ab09-b348c6983667]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:5f07'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600202, 'tstamp': 600202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232501, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.490 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e96ee6-c906-4331-8bcb-7700d58a610f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232502, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[38e8e490-d508-4f4c-9189-955d220fe0af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.564 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.564 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.564 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] No VIF found with MAC fa:16:3e:0d:0c:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.565 186962 INFO nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Using config drive#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.574 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c48f4f78-d43d-4a96-9526-dc3499c438e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.576 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.576 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.576 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 NetworkManager[55134]: <info>  [1764400554.5787] manager: (tap90812230-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.579 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 kernel: tap90812230-30: entered promiscuous mode
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.583 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.584 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:54Z|00403|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.585 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.598 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.601 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.602 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5fe28f-9489-4a18-8e51-190d28df6ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.602 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:15:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:54.603 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'env', 'PROCESS_TAG=haproxy-90812230-35cb-4e21-b16b-75b900100d8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90812230-35cb-4e21-b16b-75b900100d8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.717 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400554.7170506, 7da96eef-5195-4fe9-8421-3b8b79420a86 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.718 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] VM Started (Lifecycle Event)#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.804 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.809 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400554.7180583, 7da96eef-5195-4fe9-8421-3b8b79420a86 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.809 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.822 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.868 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.872 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:15:54 np0005539505 nova_compute[186958]: 2025-11-29 07:15:54.911 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:15:54 np0005539505 podman[232546]: 2025-11-29 07:15:54.994552376 +0000 UTC m=+0.085376744 container create c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:15:55 np0005539505 podman[232546]: 2025-11-29 07:15:54.932809494 +0000 UTC m=+0.023633892 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:15:55 np0005539505 systemd[1]: Started libpod-conmon-c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c.scope.
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.064 186962 INFO nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Creating config drive at /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.config
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.069 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpny0dr7m9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:15:55 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:15:55 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88e9b78e730c9b3d65dafb4ca93b833fdb4de6c29794669184b2ab67acfbc325/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:15:55 np0005539505 podman[232546]: 2025-11-29 07:15:55.12862897 +0000 UTC m=+0.219453348 container init c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:15:55 np0005539505 podman[232546]: 2025-11-29 07:15:55.136185244 +0000 UTC m=+0.227009612 container start c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:15:55 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [NOTICE]   (232568) : New worker (232570) forked
Nov 29 02:15:55 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [NOTICE]   (232568) : Loading success.
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.194 186962 DEBUG oslo_concurrency.processutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpny0dr7m9" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:15:55 np0005539505 kernel: tap398d5fef-7a: entered promiscuous mode
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.2530] manager: (tap398d5fef-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/210)
Nov 29 02:15:55 np0005539505 systemd-udevd[232488]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:15:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:55Z|00404|binding|INFO|Claiming lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 for this chassis.
Nov 29 02:15:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:55Z|00405|binding|INFO|398d5fef-7a9f-4bbe-8db0-754814939ba5: Claiming fa:16:3e:0d:0c:7e 10.100.0.4
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.257 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.2669] device (tap398d5fef-7a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.2686] device (tap398d5fef-7a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.290 186962 DEBUG nova.compute.manager [req-158acd29-61bc-4fd4-9e0f-3b4743a5a628 req-2aa54e5f-d3a5-413a-9e49-c16daaaff0cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.291 186962 DEBUG oslo_concurrency.lockutils [req-158acd29-61bc-4fd4-9e0f-3b4743a5a628 req-2aa54e5f-d3a5-413a-9e49-c16daaaff0cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.291 186962 DEBUG oslo_concurrency.lockutils [req-158acd29-61bc-4fd4-9e0f-3b4743a5a628 req-2aa54e5f-d3a5-413a-9e49-c16daaaff0cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.291 186962 DEBUG oslo_concurrency.lockutils [req-158acd29-61bc-4fd4-9e0f-3b4743a5a628 req-2aa54e5f-d3a5-413a-9e49-c16daaaff0cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.292 186962 DEBUG nova.compute.manager [req-158acd29-61bc-4fd4-9e0f-3b4743a5a628 req-2aa54e5f-d3a5-413a-9e49-c16daaaff0cf 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Processing event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.293 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.298 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400555.2980237, 7da96eef-5195-4fe9-8421-3b8b79420a86 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.298 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] VM Resumed (Lifecycle Event)
Nov 29 02:15:55 np0005539505 systemd-machined[153285]: New machine qemu-52-instance-00000064.
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.301 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.303 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0c:7e 10.100.0.4'], port_security=['fa:16:3e:0d:0c:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7c751db-add0-45f3-8b76-5f5474c66e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95d2550fe1b64fe3b009f5592973da32', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f2a8450d-8f67-4881-aba3-0f7af59b28ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15762181-9deb-4214-a6c7-f86e29de5da9, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=398d5fef-7a9f-4bbe-8db0-754814939ba5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.306 186962 INFO nova.virt.libvirt.driver [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance spawned successfully.
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.306 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.305 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 398d5fef-7a9f-4bbe-8db0-754814939ba5 in datapath b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 bound to our chassis
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.307 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b6f3b8b4-1344-4bf8-acd5-1f745785c2a3
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.309 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:55Z|00406|binding|INFO|Setting lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 ovn-installed in OVS
Nov 29 02:15:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:55Z|00407|binding|INFO|Setting lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 up in Southbound
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.315 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:15:55 np0005539505 systemd[1]: Started Virtual Machine qemu-52-instance-00000064.
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.319 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[de7a47d8-dd33-42fd-93e8-9aba10fe8b52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.320 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb6f3b8b4-11 in ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.322 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb6f3b8b4-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.322 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04c37432-cac6-4401-a0fb-0c079dfa7ed4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.323 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8e535ba3-8eda-484b-be62-236ba2f06556]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.333 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[863d7da3-37c0-42be-a897-605bdf5318d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.347 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[55cd47d8-6673-428e-88cd-4ee2e8f8a0ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.377 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.378 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc54d42-d1c8-49f6-a233-1a0510bf0200]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.384 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.3878] manager: (tapb6f3b8b4-10): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.387 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.388 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.387 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b12c018f-f17f-4079-a7fd-433cad1f5cbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.388 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.389 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.389 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.390 186962 DEBUG nova.virt.libvirt.driver [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.419 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fee4d0f3-4aca-4d88-bce2-15faf6df3028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.422 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[49c1ffa7-e985-4edf-8a1f-705aea0c493a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.440 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.4468] device (tapb6f3b8b4-10): carrier: link connected
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.452 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9484e499-5574-436b-80b5-15d858e3b07f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.471 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3de014b-a4ca-4212-969f-854e96b5db2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb6f3b8b4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:29:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600303, 'reachable_time': 36037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232611, 'error': None, 'target': 'ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.490 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[270cc8bc-caba-4508-9b8a-cca97e599495]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:2996'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600303, 'tstamp': 600303}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232612, 'error': None, 'target': 'ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.507 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e8764381-b2ed-4a63-a3d6-735e2130c6ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb6f3b8b4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:29:96'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 137], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600303, 'reachable_time': 36037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232613, 'error': None, 'target': 'ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.539 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d5e9dc9a-1024-4ecc-b115-ce16dda894cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.609 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[664c2c79-c8ca-44f1-9ea1-d1e845b79733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.610 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6f3b8b4-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.611 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.611 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6f3b8b4-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:15:55 np0005539505 NetworkManager[55134]: <info>  [1764400555.6147] manager: (tapb6f3b8b4-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Nov 29 02:15:55 np0005539505 kernel: tapb6f3b8b4-10: entered promiscuous mode
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.623 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb6f3b8b4-10, col_values=(('external_ids', {'iface-id': 'd5723e6f-213b-4368-8411-945ca5628b67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.625 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b6f3b8b4-1344-4bf8-acd5-1f745785c2a3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b6f3b8b4-1344-4bf8-acd5-1f745785c2a3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:15:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:55Z|00408|binding|INFO|Releasing lport d5723e6f-213b-4368-8411-945ca5628b67 from this chassis (sb_readonly=0)
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.626 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e44c5ae-7133-4f4b-8dc2-ab7e33d16ec9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.627 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/b6f3b8b4-1344-4bf8-acd5-1f745785c2a3.pid.haproxy
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID b6f3b8b4-1344-4bf8-acd5-1f745785c2a3
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.627 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'env', 'PROCESS_TAG=haproxy-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b6f3b8b4-1344-4bf8-acd5-1f745785c2a3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.633 186962 INFO nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Took 19.73 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.633 186962 DEBUG nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.718 186962 DEBUG nova.compute.manager [req-c7130286-c8c6-4469-954c-dc60d42ebb24 req-5b4a2431-92b0-43f1-bc51-f9dc42dcd7f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.718 186962 DEBUG oslo_concurrency.lockutils [req-c7130286-c8c6-4469-954c-dc60d42ebb24 req-5b4a2431-92b0-43f1-bc51-f9dc42dcd7f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.719 186962 DEBUG oslo_concurrency.lockutils [req-c7130286-c8c6-4469-954c-dc60d42ebb24 req-5b4a2431-92b0-43f1-bc51-f9dc42dcd7f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.719 186962 DEBUG oslo_concurrency.lockutils [req-c7130286-c8c6-4469-954c-dc60d42ebb24 req-5b4a2431-92b0-43f1-bc51-f9dc42dcd7f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.719 186962 DEBUG nova.compute.manager [req-c7130286-c8c6-4469-954c-dc60d42ebb24 req-5b4a2431-92b0-43f1-bc51-f9dc42dcd7f3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Processing event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.785 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400555.783887, a7c751db-add0-45f3-8b76-5f5474c66e46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.786 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] VM Started (Lifecycle Event)#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.788 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.791 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.794 186962 INFO nova.virt.libvirt.driver [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance spawned successfully.#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.794 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.899 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.908 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.912 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.912 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.913 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.913 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.914 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.914 186962 DEBUG nova.virt.libvirt.driver [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:15:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:55.935 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:15:55 np0005539505 nova_compute[186958]: 2025-11-29 07:15:55.935 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:56 np0005539505 podman[232649]: 2025-11-29 07:15:56.007934889 +0000 UTC m=+0.048567779 container create 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.039 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.040 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400555.7840393, a7c751db-add0-45f3-8b76-5f5474c66e46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.040 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:15:56 np0005539505 systemd[1]: Started libpod-conmon-88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2.scope.
Nov 29 02:15:56 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:15:56 np0005539505 podman[232649]: 2025-11-29 07:15:55.982296671 +0000 UTC m=+0.022929581 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:15:56 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/604edd57708fb731f5c28e7b921442f4e714e71b1bbc10f9cfd63917fc433a69/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:15:56 np0005539505 podman[232649]: 2025-11-29 07:15:56.094482105 +0000 UTC m=+0.135115025 container init 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:15:56 np0005539505 podman[232649]: 2025-11-29 07:15:56.100288299 +0000 UTC m=+0.140921189 container start 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:15:56 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [NOTICE]   (232668) : New worker (232670) forked
Nov 29 02:15:56 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [NOTICE]   (232668) : Loading success.
Nov 29 02:15:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:56.163 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.184 186962 INFO nova.compute.manager [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Took 20.98 seconds to build instance.#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.238 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.242 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400555.790849, a7c751db-add0-45f3-8b76-5f5474c66e46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.242 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.277 186962 DEBUG oslo_concurrency.lockutils [None req-66151a8b-d13b-4566-943c-684c14d26092 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.309 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.312 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.370 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.395 186962 INFO nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Took 20.25 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.395 186962 DEBUG nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.650 186962 INFO nova.compute.manager [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Took 21.41 seconds to build instance.#033[00m
Nov 29 02:15:56 np0005539505 nova_compute[186958]: 2025-11-29 07:15:56.702 186962 DEBUG oslo_concurrency.lockutils [None req-e87a0da0-d8ff-43d3-b700-6783a5c40e63 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.744 186962 DEBUG nova.compute.manager [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.744 186962 DEBUG oslo_concurrency.lockutils [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.752 186962 DEBUG oslo_concurrency.lockutils [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.752 186962 DEBUG oslo_concurrency.lockutils [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.753 186962 DEBUG nova.compute.manager [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.753 186962 WARNING nova.compute.manager [req-47c94833-50fd-4f7a-a8dd-b315a73baee6 req-63e9a9f2-bd0b-493d-bdf3-ef52f505f837 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.995 186962 DEBUG nova.compute.manager [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.996 186962 DEBUG oslo_concurrency.lockutils [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.996 186962 DEBUG oslo_concurrency.lockutils [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.996 186962 DEBUG oslo_concurrency.lockutils [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.996 186962 DEBUG nova.compute.manager [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] No waiting events found dispatching network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:15:57 np0005539505 nova_compute[186958]: 2025-11-29 07:15:57.997 186962 WARNING nova.compute.manager [req-cd1b8718-12a1-4202-b465-009bb5752633 req-8c22a0a7-2996-482e-9090-d47f8b0849d3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received unexpected event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:15:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:15:58.166 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:15:58 np0005539505 NetworkManager[55134]: <info>  [1764400558.6175] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/213)
Nov 29 02:15:58 np0005539505 NetworkManager[55134]: <info>  [1764400558.6187] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/214)
Nov 29 02:15:58 np0005539505 nova_compute[186958]: 2025-11-29 07:15:58.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:58 np0005539505 nova_compute[186958]: 2025-11-29 07:15:58.754 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:58Z|00409|binding|INFO|Releasing lport d5723e6f-213b-4368-8411-945ca5628b67 from this chassis (sb_readonly=0)
Nov 29 02:15:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:15:58Z|00410|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:15:58 np0005539505 nova_compute[186958]: 2025-11-29 07:15:58.783 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:59 np0005539505 nova_compute[186958]: 2025-11-29 07:15:59.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:15:59 np0005539505 nova_compute[186958]: 2025-11-29 07:15:59.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:15:59 np0005539505 nova_compute[186958]: 2025-11-29 07:15:59.825 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.302 186962 DEBUG nova.compute.manager [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-changed-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.303 186962 DEBUG nova.compute.manager [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing instance network info cache due to event network-changed-7f253c88-5c90-410c-bbe6-a152ae7c3a63. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.303 186962 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.304 186962 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.304 186962 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing network info cache for port 7f253c88-5c90-410c-bbe6-a152ae7c3a63 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:16:00 np0005539505 nova_compute[186958]: 2025-11-29 07:16:00.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:00 np0005539505 podman[232681]: 2025-11-29 07:16:00.726645323 +0000 UTC m=+0.056073562 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:16:00 np0005539505 podman[232680]: 2025-11-29 07:16:00.758738354 +0000 UTC m=+0.091276311 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:16:02 np0005539505 nova_compute[186958]: 2025-11-29 07:16:02.200 186962 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updated VIF entry in instance network info cache for port 7f253c88-5c90-410c-bbe6-a152ae7c3a63. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:16:02 np0005539505 nova_compute[186958]: 2025-11-29 07:16:02.202 186962 DEBUG nova.network.neutron [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:02 np0005539505 nova_compute[186958]: 2025-11-29 07:16:02.599 186962 DEBUG oslo_concurrency.lockutils [req-c69bd32f-11f8-4817-a65c-128a91004984 req-948694af-cc72-4208-8984-a6f2775df16e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:03 np0005539505 podman[232725]: 2025-11-29 07:16:03.716304039 +0000 UTC m=+0.052246993 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.303 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.304 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.305 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.305 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.306 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.341 186962 INFO nova.compute.manager [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Terminating instance#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.415 186962 DEBUG nova.compute.manager [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:16:04 np0005539505 kernel: tap398d5fef-7a (unregistering): left promiscuous mode
Nov 29 02:16:04 np0005539505 NetworkManager[55134]: <info>  [1764400564.4455] device (tap398d5fef-7a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.456 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00411|binding|INFO|Releasing lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 from this chassis (sb_readonly=0)
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00412|binding|INFO|Setting lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 down in Southbound
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00413|binding|INFO|Removing iface tap398d5fef-7a ovn-installed in OVS
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.482 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0c:7e 10.100.0.4'], port_security=['fa:16:3e:0d:0c:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7c751db-add0-45f3-8b76-5f5474c66e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95d2550fe1b64fe3b009f5592973da32', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f2a8450d-8f67-4881-aba3-0f7af59b28ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15762181-9deb-4214-a6c7-f86e29de5da9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=398d5fef-7a9f-4bbe-8db0-754814939ba5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.484 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 398d5fef-7a9f-4bbe-8db0-754814939ba5 in datapath b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 unbound from our chassis#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.487 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.488 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7cddff-1d59-4ed2-8169-fbb7de6100e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.488 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 namespace which is not needed anymore#033[00m
Nov 29 02:16:04 np0005539505 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000064.scope: Deactivated successfully.
Nov 29 02:16:04 np0005539505 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000064.scope: Consumed 9.214s CPU time.
Nov 29 02:16:04 np0005539505 systemd-machined[153285]: Machine qemu-52-instance-00000064 terminated.
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [NOTICE]   (232668) : haproxy version is 2.8.14-c23fe91
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [NOTICE]   (232668) : path to executable is /usr/sbin/haproxy
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [WARNING]  (232668) : Exiting Master process...
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [WARNING]  (232668) : Exiting Master process...
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [ALERT]    (232668) : Current worker (232670) exited with code 143 (Terminated)
Nov 29 02:16:04 np0005539505 neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3[232663]: [WARNING]  (232668) : All workers exited. Exiting... (0)
Nov 29 02:16:04 np0005539505 systemd[1]: libpod-88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2.scope: Deactivated successfully.
Nov 29 02:16:04 np0005539505 podman[232767]: 2025-11-29 07:16:04.618394084 +0000 UTC m=+0.048413754 container died 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:16:04 np0005539505 kernel: tap398d5fef-7a: entered promiscuous mode
Nov 29 02:16:04 np0005539505 systemd-udevd[232747]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2-userdata-shm.mount: Deactivated successfully.
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.642 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00414|binding|INFO|Claiming lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 for this chassis.
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00415|binding|INFO|398d5fef-7a9f-4bbe-8db0-754814939ba5: Claiming fa:16:3e:0d:0c:7e 10.100.0.4
Nov 29 02:16:04 np0005539505 NetworkManager[55134]: <info>  [1764400564.6498] manager: (tap398d5fef-7a): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Nov 29 02:16:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay-604edd57708fb731f5c28e7b921442f4e714e71b1bbc10f9cfd63917fc433a69-merged.mount: Deactivated successfully.
Nov 29 02:16:04 np0005539505 kernel: tap398d5fef-7a (unregistering): left promiscuous mode
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: hostname: compute-2
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00416|binding|INFO|Setting lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 ovn-installed in OVS
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00417|if_status|INFO|Not setting lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 down as sb is readonly
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.670 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 podman[232767]: 2025-11-29 07:16:04.674197688 +0000 UTC m=+0.104217358 container cleanup 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:04Z|00418|binding|INFO|Releasing lport 398d5fef-7a9f-4bbe-8db0-754814939ba5 from this chassis (sb_readonly=0)
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 systemd[1]: libpod-conmon-88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2.scope: Deactivated successfully.
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.682 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0c:7e 10.100.0.4'], port_security=['fa:16:3e:0d:0c:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7c751db-add0-45f3-8b76-5f5474c66e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95d2550fe1b64fe3b009f5592973da32', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f2a8450d-8f67-4881-aba3-0f7af59b28ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15762181-9deb-4214-a6c7-f86e29de5da9, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=398d5fef-7a9f-4bbe-8db0-754814939ba5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.693 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 virtnodedevd[186570]: ethtool ioctl error on tap398d5fef-7a: No such device
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.715 186962 INFO nova.virt.libvirt.driver [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Instance destroyed successfully.#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.716 186962 DEBUG nova.objects.instance [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lazy-loading 'resources' on Instance uuid a7c751db-add0-45f3-8b76-5f5474c66e46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.722 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:0c:7e 10.100.0.4'], port_security=['fa:16:3e:0d:0c:7e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a7c751db-add0-45f3-8b76-5f5474c66e46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '95d2550fe1b64fe3b009f5592973da32', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f2a8450d-8f67-4881-aba3-0f7af59b28ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=15762181-9deb-4214-a6c7-f86e29de5da9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=398d5fef-7a9f-4bbe-8db0-754814939ba5) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:04 np0005539505 podman[232813]: 2025-11-29 07:16:04.751495871 +0000 UTC m=+0.046380797 container remove 88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.758 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[406e2944-0874-4288-a83a-5a167b1c87ce]: (4, ('Sat Nov 29 07:16:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 (88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2)\n88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2\nSat Nov 29 07:16:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 (88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2)\n88e07cc8e937eeaa6b7ac8c2d11bd3c427208559803f79a8292e93dfd3610ca2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.760 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99047a44-ddc2-4e44-bea9-1cacb2c20d96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.761 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6f3b8b4-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 kernel: tapb6f3b8b4-10: left promiscuous mode
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.777 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.780 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[72a20103-4175-46b6-b93e-849a366b6d90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.800 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3443aa57-05c3-4842-87cc-f48b4f4ac5ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.802 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c42868ae-a503-4a57-a6b8-5e6b85013a4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.816 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b6545f6c-eea2-4ec7-8140-84602295bb72]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600295, 'reachable_time': 40563, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232842, 'error': None, 'target': 'ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.819 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.819 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[da79c8c0-f2ff-4c91-b3fc-dd0ec3558822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.821 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 398d5fef-7a9f-4bbe-8db0-754814939ba5 in datapath b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 unbound from our chassis#033[00m
Nov 29 02:16:04 np0005539505 systemd[1]: run-netns-ovnmeta\x2db6f3b8b4\x2d1344\x2d4bf8\x2dacd5\x2d1f745785c2a3.mount: Deactivated successfully.
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.822 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.824 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2dac29-a58e-4f61-9d11-09d3c0dcb5d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.824 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 398d5fef-7a9f-4bbe-8db0-754814939ba5 in datapath b6f3b8b4-1344-4bf8-acd5-1f745785c2a3 unbound from our chassis#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.826 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b6f3b8b4-1344-4bf8-acd5-1f745785c2a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:16:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:04.826 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03bf1ae3-a6c6-4898-86d2-07a72e414f5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.826 186962 DEBUG nova.virt.libvirt.vif [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerTagsTestJSON-server-310634475',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servertagstestjson-server-310634475',id=100,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='95d2550fe1b64fe3b009f5592973da32',ramdisk_id='',reservation_id='r-2sy09d0d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='t
empest-ServerTagsTestJSON-1064287827',owner_user_name='tempest-ServerTagsTestJSON-1064287827-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:56Z,user_data=None,user_id='1192c8e43d154d7496b324c217093f43',uuid=a7c751db-add0-45f3-8b76-5f5474c66e46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.827 186962 DEBUG nova.network.os_vif_util [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converting VIF {"id": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "address": "fa:16:3e:0d:0c:7e", "network": {"id": "b6f3b8b4-1344-4bf8-acd5-1f745785c2a3", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1411633579-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "95d2550fe1b64fe3b009f5592973da32", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap398d5fef-7a", "ovs_interfaceid": "398d5fef-7a9f-4bbe-8db0-754814939ba5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.828 186962 DEBUG nova.network.os_vif_util [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.829 186962 DEBUG os_vif [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.830 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.831 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap398d5fef-7a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.834 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.837 186962 INFO os_vif [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:0c:7e,bridge_name='br-int',has_traffic_filtering=True,id=398d5fef-7a9f-4bbe-8db0-754814939ba5,network=Network(b6f3b8b4-1344-4bf8-acd5-1f745785c2a3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap398d5fef-7a')#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.838 186962 INFO nova.virt.libvirt.driver [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Deleting instance files /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46_del#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.838 186962 INFO nova.virt.libvirt.driver [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Deletion of /var/lib/nova/instances/a7c751db-add0-45f3-8b76-5f5474c66e46_del complete#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.882 186962 DEBUG nova.compute.manager [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-unplugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.891 186962 DEBUG oslo_concurrency.lockutils [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.891 186962 DEBUG oslo_concurrency.lockutils [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.891 186962 DEBUG oslo_concurrency.lockutils [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.891 186962 DEBUG nova.compute.manager [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] No waiting events found dispatching network-vif-unplugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:04 np0005539505 nova_compute[186958]: 2025-11-29 07:16:04.891 186962 DEBUG nova.compute.manager [req-cd092556-9ca0-404b-87b9-3b960a47ffc7 req-1c45a95a-5fd2-4658-b121-f78c37eb63b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-unplugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:16:05 np0005539505 nova_compute[186958]: 2025-11-29 07:16:05.105 186962 INFO nova.compute.manager [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Took 0.69 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:16:05 np0005539505 nova_compute[186958]: 2025-11-29 07:16:05.106 186962 DEBUG oslo.service.loopingcall [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:16:05 np0005539505 nova_compute[186958]: 2025-11-29 07:16:05.106 186962 DEBUG nova.compute.manager [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:16:05 np0005539505 nova_compute[186958]: 2025-11-29 07:16:05.107 186962 DEBUG nova.network.neutron [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:16:05 np0005539505 nova_compute[186958]: 2025-11-29 07:16:05.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:06 np0005539505 nova_compute[186958]: 2025-11-29 07:16:06.619 186962 DEBUG nova.network.neutron [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:06 np0005539505 nova_compute[186958]: 2025-11-29 07:16:06.635 186962 INFO nova.compute.manager [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Took 1.53 seconds to deallocate network for instance.#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.005 186962 DEBUG nova.compute.manager [req-42035b6d-3e00-4b28-a33d-a378584b55d6 req-e82b95dd-4eb4-44b6-b1d5-b93809334f31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-deleted-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.006 186962 DEBUG nova.compute.manager [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.007 186962 DEBUG oslo_concurrency.lockutils [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.007 186962 DEBUG oslo_concurrency.lockutils [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.007 186962 DEBUG oslo_concurrency.lockutils [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.008 186962 DEBUG nova.compute.manager [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] No waiting events found dispatching network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.008 186962 WARNING nova.compute.manager [req-b75db2a2-df46-4f30-9449-ebb034db8504 req-c6c03bed-7f62-415c-bcb8-f2a8e6cee066 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Received unexpected event network-vif-plugged-398d5fef-7a9f-4bbe-8db0-754814939ba5 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.013 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.014 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.096 186962 DEBUG nova.compute.provider_tree [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.130 186962 DEBUG nova.scheduler.client.report [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.179 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.272 186962 INFO nova.scheduler.client.report [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Deleted allocations for instance a7c751db-add0-45f3-8b76-5f5474c66e46#033[00m
Nov 29 02:16:07 np0005539505 nova_compute[186958]: 2025-11-29 07:16:07.447 186962 DEBUG oslo_concurrency.lockutils [None req-557f7025-9117-40d8-8f53-039a37d87a06 1192c8e43d154d7496b324c217093f43 95d2550fe1b64fe3b009f5592973da32 - - default default] Lock "a7c751db-add0-45f3-8b76-5f5474c66e46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:09Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:c1:22 10.100.0.5
Nov 29 02:16:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:09Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:c1:22 10.100.0.5
Nov 29 02:16:09 np0005539505 nova_compute[186958]: 2025-11-29 07:16:09.829 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:09 np0005539505 nova_compute[186958]: 2025-11-29 07:16:09.833 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.533 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.534 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.534 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.534 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.708 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.770 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.771 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.830 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.987 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.988 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5517MB free_disk=73.1960678100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.989 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:10 np0005539505 nova_compute[186958]: 2025-11-29 07:16:10.989 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.076 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 7da96eef-5195-4fe9-8421-3b8b79420a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.076 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.076 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.122 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.138 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.162 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:16:11 np0005539505 nova_compute[186958]: 2025-11-29 07:16:11.162 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:12 np0005539505 nova_compute[186958]: 2025-11-29 07:16:12.162 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:12 np0005539505 nova_compute[186958]: 2025-11-29 07:16:12.163 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:16:12 np0005539505 nova_compute[186958]: 2025-11-29 07:16:12.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:12 np0005539505 podman[232866]: 2025-11-29 07:16:12.721869865 +0000 UTC m=+0.054064705 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:16:12 np0005539505 podman[232867]: 2025-11-29 07:16:12.766737117 +0000 UTC m=+0.096795627 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.833 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.833 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.833 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:16:13 np0005539505 nova_compute[186958]: 2025-11-29 07:16:13.834 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:14 np0005539505 nova_compute[186958]: 2025-11-29 07:16:14.831 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.568 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.590 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.591 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.591 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.792 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.793 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.793 186962 DEBUG nova.objects.instance [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.825 186962 DEBUG nova.objects.instance [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.843 186962 DEBUG nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:16:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:17Z|00419|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:16:17 np0005539505 nova_compute[186958]: 2025-11-29 07:16:17.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:18 np0005539505 nova_compute[186958]: 2025-11-29 07:16:18.488 186962 DEBUG nova.policy [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:16:18 np0005539505 podman[232917]: 2025-11-29 07:16:18.718048344 +0000 UTC m=+0.054806476 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:16:18 np0005539505 podman[232918]: 2025-11-29 07:16:18.747710146 +0000 UTC m=+0.080117044 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:16:19 np0005539505 nova_compute[186958]: 2025-11-29 07:16:19.711 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400564.7081506, a7c751db-add0-45f3-8b76-5f5474c66e46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:19 np0005539505 nova_compute[186958]: 2025-11-29 07:16:19.712 186962 INFO nova.compute.manager [-] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:16:19 np0005539505 nova_compute[186958]: 2025-11-29 07:16:19.732 186962 DEBUG nova.compute.manager [None req-e82090dd-fdb4-471e-a9d5-dc2cbc0627c6 - - - - - -] [instance: a7c751db-add0-45f3-8b76-5f5474c66e46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:19 np0005539505 nova_compute[186958]: 2025-11-29 07:16:19.833 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:20 np0005539505 nova_compute[186958]: 2025-11-29 07:16:20.272 186962 DEBUG nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully created port: b04bb5a4-4610-4151-a86a-f1f55b164195 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.140 186962 DEBUG nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully updated port: b04bb5a4-4610-4151-a86a-f1f55b164195 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.163 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.163 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.164 186962 DEBUG nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.277 186962 DEBUG nova.compute.manager [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-changed-b04bb5a4-4610-4151-a86a-f1f55b164195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.278 186962 DEBUG nova.compute.manager [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing instance network info cache due to event network-changed-b04bb5a4-4610-4151-a86a-f1f55b164195. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.278 186962 DEBUG oslo_concurrency.lockutils [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:22 np0005539505 nova_compute[186958]: 2025-11-29 07:16:22.427 186962 WARNING nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:24 np0005539505 nova_compute[186958]: 2025-11-29 07:16:24.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:16:24 np0005539505 nova_compute[186958]: 2025-11-29 07:16:24.836 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.276 186962 DEBUG nova.network.neutron [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.297 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.298 186962 DEBUG oslo_concurrency.lockutils [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.299 186962 DEBUG nova.network.neutron [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing network info cache for port b04bb5a4-4610-4151-a86a-f1f55b164195 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.302 186962 DEBUG nova.virt.libvirt.vif [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.302 186962 DEBUG nova.network.os_vif_util [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.303 186962 DEBUG nova.network.os_vif_util [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.304 186962 DEBUG os_vif [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.305 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.305 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.305 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.308 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.308 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb04bb5a4-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.309 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb04bb5a4-46, col_values=(('external_ids', {'iface-id': 'b04bb5a4-4610-4151-a86a-f1f55b164195', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:d4:98', 'vm-uuid': '7da96eef-5195-4fe9-8421-3b8b79420a86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 NetworkManager[55134]: <info>  [1764400586.3117] manager: (tapb04bb5a4-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.312 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.318 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.320 186962 INFO os_vif [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46')#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.321 186962 DEBUG nova.virt.libvirt.vif [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.321 186962 DEBUG nova.network.os_vif_util [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.322 186962 DEBUG nova.network.os_vif_util [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.325 186962 DEBUG nova.virt.libvirt.guest [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:4e:d4:98"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <target dev="tapb04bb5a4-46"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:16:26 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:16:26 np0005539505 kernel: tapb04bb5a4-46: entered promiscuous mode
Nov 29 02:16:26 np0005539505 NetworkManager[55134]: <info>  [1764400586.3355] manager: (tapb04bb5a4-46): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Nov 29 02:16:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:26Z|00420|binding|INFO|Claiming lport b04bb5a4-4610-4151-a86a-f1f55b164195 for this chassis.
Nov 29 02:16:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:26Z|00421|binding|INFO|b04bb5a4-4610-4151-a86a-f1f55b164195: Claiming fa:16:3e:4e:d4:98 10.100.0.13
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.337 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.345 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:d4:98 10.100.0.13'], port_security=['fa:16:3e:4e:d4:98 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b04bb5a4-4610-4151-a86a-f1f55b164195) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.346 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b04bb5a4-4610-4151-a86a-f1f55b164195 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.348 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:16:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:26Z|00422|binding|INFO|Setting lport b04bb5a4-4610-4151-a86a-f1f55b164195 ovn-installed in OVS
Nov 29 02:16:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:26Z|00423|binding|INFO|Setting lport b04bb5a4-4610-4151-a86a-f1f55b164195 up in Southbound
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.355 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.363 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7ef5a9-3eb9-4a11-87fb-fa6074a73c7e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 systemd-udevd[232965]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:26 np0005539505 NetworkManager[55134]: <info>  [1764400586.3812] device (tapb04bb5a4-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:16:26 np0005539505 NetworkManager[55134]: <info>  [1764400586.3821] device (tapb04bb5a4-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.395 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[818e2642-73cb-4314-bb23-332c8d889a6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.400 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c00517-8598-4f31-b8f8-1a848117b481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.431 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1f906c5d-7d00-454f-945a-7226708528ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.446 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1d344505-e240-4301-bad3-f261ac2b1336]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232972, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.451 186962 DEBUG nova.virt.libvirt.driver [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.451 186962 DEBUG nova.virt.libvirt.driver [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.451 186962 DEBUG nova.virt.libvirt.driver [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:c1:c1:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.452 186962 DEBUG nova.virt.libvirt.driver [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:4e:d4:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.460 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbcd96e-137e-4ab4-a0a0-9d1fac94c85d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232973, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232973, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.462 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.464 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.464 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.466 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.466 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.467 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.467 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.472 186962 DEBUG nova.virt.libvirt.guest [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:16:26</nova:creationTime>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:16:26 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    <nova:port uuid="b04bb5a4-4610-4151-a86a-f1f55b164195">
Nov 29 02:16:26 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:16:26 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:16:26 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:16:26 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:16:26 np0005539505 nova_compute[186958]: 2025-11-29 07:16:26.502 186962 DEBUG oslo_concurrency.lockutils [None req-1591ac3c-fa0a-407a-92f8-6625aac9e7dc 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.955 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.956 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:26.956 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.615 186962 DEBUG nova.compute.manager [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.615 186962 DEBUG oslo_concurrency.lockutils [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.616 186962 DEBUG oslo_concurrency.lockutils [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.616 186962 DEBUG oslo_concurrency.lockutils [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.616 186962 DEBUG nova.compute.manager [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:27 np0005539505 nova_compute[186958]: 2025-11-29 07:16:27.616 186962 WARNING nova.compute.manager [req-809aa6c3-f5ee-4daf-9848-f706fb71093c req-6d05a3c0-c70e-4a0c-8152-4cdcc59c91f4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:16:28 np0005539505 nova_compute[186958]: 2025-11-29 07:16:28.279 186962 DEBUG nova.network.neutron [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updated VIF entry in instance network info cache for port b04bb5a4-4610-4151-a86a-f1f55b164195. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:16:28 np0005539505 nova_compute[186958]: 2025-11-29 07:16:28.280 186962 DEBUG nova.network.neutron [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:28 np0005539505 nova_compute[186958]: 2025-11-29 07:16:28.298 186962 DEBUG oslo_concurrency.lockutils [req-47d5deb4-5fcf-4a2f-b0c2-73b4ac478884 req-b2262711-9baf-45c0-ada1-0803ea2ce690 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:28 np0005539505 nova_compute[186958]: 2025-11-29 07:16:28.329 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating tmpfile /var/lib/nova/instances/tmpgy45ripl to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Nov 29 02:16:28 np0005539505 nova_compute[186958]: 2025-11-29 07:16:28.472 186962 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Nov 29 02:16:28 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:28Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:d4:98 10.100.0.13
Nov 29 02:16:28 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:28Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:d4:98 10.100.0.13
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.016 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.016 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.017 186962 DEBUG nova.objects.instance [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.425 186962 DEBUG nova.objects.instance [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.440 186962 DEBUG nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.550 186962 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.577 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.577 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.578 186962 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.755 186962 DEBUG nova.compute.manager [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.756 186962 DEBUG oslo_concurrency.lockutils [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.756 186962 DEBUG oslo_concurrency.lockutils [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.756 186962 DEBUG oslo_concurrency.lockutils [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.756 186962 DEBUG nova.compute.manager [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.757 186962 WARNING nova.compute.manager [req-7d21bb8d-f487-4245-a662-98bba39d0465 req-5622859b-52f3-4c87-b7d4-c9e7005cc6ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-b04bb5a4-4610-4151-a86a-f1f55b164195 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.837 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:29 np0005539505 nova_compute[186958]: 2025-11-29 07:16:29.901 186962 DEBUG nova.policy [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.596 186962 DEBUG nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully created port: 816e158d-4c1c-4ea8-ae90-eb4e66048a31 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.814 186962 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.828 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.843 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.843 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating instance directory: /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.844 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Creating disk.info with the contents: {'/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk': 'qcow2', '/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.844 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.844 186962 DEBUG nova.objects.instance [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'trusted_certs' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.873 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.927 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.928 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.929 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.940 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.996 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:30 np0005539505 nova_compute[186958]: 2025-11-29 07:16:30.997 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.231 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk 1073741824" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.231 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.232 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.288 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.289 186962 DEBUG nova.virt.disk.api [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Checking if we can resize image /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.289 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.313 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.344 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.345 186962 DEBUG nova.virt.disk.api [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Cannot resize image /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.345 186962 DEBUG nova.objects.instance [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lazy-loading 'migration_context' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:31 np0005539505 podman[232990]: 2025-11-29 07:16:31.716548082 +0000 UTC m=+0.047117258 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:16:31 np0005539505 podman[232989]: 2025-11-29 07:16:31.72423911 +0000 UTC m=+0.057153333 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.814 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.836 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.837 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config to /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:16:31 np0005539505 nova_compute[186958]: 2025-11-29 07:16:31.838 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.353 186962 DEBUG oslo_concurrency.processutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363/disk.config /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.354 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.355 186962 DEBUG nova.virt.libvirt.vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:16:02Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.356 186962 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.357 186962 DEBUG nova.network.os_vif_util [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.357 186962 DEBUG os_vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.358 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.359 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.361 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.361 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29881f52-aa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.362 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap29881f52-aa, col_values=(('external_ids', {'iface-id': '29881f52-aa42-4a78-a87b-06e906811ff2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:9d:fe', 'vm-uuid': '704c4aa7-3239-4ecc-bfdc-c72642678363'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:32 np0005539505 NetworkManager[55134]: <info>  [1764400592.3655] manager: (tap29881f52-aa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/218)
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.372 186962 INFO os_vif [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa')#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.372 186962 DEBUG nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Nov 29 02:16:32 np0005539505 nova_compute[186958]: 2025-11-29 07:16:32.373 186962 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.312 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.513 186962 DEBUG nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully updated port: 816e158d-4c1c-4ea8-ae90-eb4e66048a31 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:16:34 np0005539505 podman[233040]: 2025-11-29 07:16:34.729087886 +0000 UTC m=+0.056797932 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.844 186962 DEBUG nova.compute.manager [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-changed-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.845 186962 DEBUG nova.compute.manager [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing instance network info cache due to event network-changed-816e158d-4c1c-4ea8-ae90-eb4e66048a31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.845 186962 DEBUG oslo_concurrency.lockutils [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.845 186962 DEBUG oslo_concurrency.lockutils [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:34 np0005539505 nova_compute[186958]: 2025-11-29 07:16:34.845 186962 DEBUG nova.network.neutron [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing network info cache for port 816e158d-4c1c-4ea8-ae90-eb4e66048a31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:16:35 np0005539505 nova_compute[186958]: 2025-11-29 07:16:35.747 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:36 np0005539505 nova_compute[186958]: 2025-11-29 07:16:36.581 186962 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Port 29881f52-aa42-4a78-a87b-06e906811ff2 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.291 186962 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgy45ripl',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='704c4aa7-3239-4ecc-bfdc-c72642678363',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.404 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:37 np0005539505 systemd[1]: Starting libvirt proxy daemon...
Nov 29 02:16:37 np0005539505 systemd[1]: Started libvirt proxy daemon.
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.573 186962 DEBUG nova.network.neutron [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Added VIF to instance network info cache for port 816e158d-4c1c-4ea8-ae90-eb4e66048a31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3489#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.574 186962 DEBUG nova.network.neutron [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.607 186962 DEBUG oslo_concurrency.lockutils [req-0c4e33d0-08b1-4aeb-8400-5ffb8671a4bf req-586ccbea-dbba-432e-8dd0-d6a9a6df78a4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.608 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.608 186962 DEBUG nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:37 np0005539505 kernel: tap29881f52-aa: entered promiscuous mode
Nov 29 02:16:37 np0005539505 NetworkManager[55134]: <info>  [1764400597.6215] manager: (tap29881f52-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/219)
Nov 29 02:16:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:37Z|00424|binding|INFO|Claiming lport 29881f52-aa42-4a78-a87b-06e906811ff2 for this additional chassis.
Nov 29 02:16:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:37Z|00425|binding|INFO|29881f52-aa42-4a78-a87b-06e906811ff2: Claiming fa:16:3e:44:9d:fe 10.100.0.6
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:37Z|00426|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 ovn-installed in OVS
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:37 np0005539505 systemd-udevd[233092]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:37 np0005539505 systemd-machined[153285]: New machine qemu-53-instance-00000066.
Nov 29 02:16:37 np0005539505 NetworkManager[55134]: <info>  [1764400597.6676] device (tap29881f52-aa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:16:37 np0005539505 NetworkManager[55134]: <info>  [1764400597.6686] device (tap29881f52-aa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:16:37 np0005539505 systemd[1]: Started Virtual Machine qemu-53-instance-00000066.
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.789 186962 WARNING nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.790 186962 WARNING nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.790 186962 WARNING nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:37 np0005539505 nova_compute[186958]: 2025-11-29 07:16:37.790 186962 WARNING nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 816e158d-4c1c-4ea8-ae90-eb4e66048a31 already exists in list: port_ids containing: ['816e158d-4c1c-4ea8-ae90-eb4e66048a31']. ignoring it#033[00m
Nov 29 02:16:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:38.302 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:38 np0005539505 nova_compute[186958]: 2025-11-29 07:16:38.303 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:38.304 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:16:38 np0005539505 nova_compute[186958]: 2025-11-29 07:16:38.608 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400598.6076283, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:38 np0005539505 nova_compute[186958]: 2025-11-29 07:16:38.609 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Started (Lifecycle Event)#033[00m
Nov 29 02:16:38 np0005539505 nova_compute[186958]: 2025-11-29 07:16:38.762 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.552 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400599.552482, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.553 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.570 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.573 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.589 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Nov 29 02:16:39 np0005539505 nova_compute[186958]: 2025-11-29 07:16:39.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:42 np0005539505 nova_compute[186958]: 2025-11-29 07:16:42.407 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:43 np0005539505 podman[233124]: 2025-11-29 07:16:43.726123101 +0000 UTC m=+0.055946499 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:16:43 np0005539505 podman[233125]: 2025-11-29 07:16:43.75110794 +0000 UTC m=+0.079827296 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00427|binding|INFO|Claiming lport 29881f52-aa42-4a78-a87b-06e906811ff2 for this chassis.
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00428|binding|INFO|29881f52-aa42-4a78-a87b-06e906811ff2: Claiming fa:16:3e:44:9d:fe 10.100.0.6
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00429|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 up in Southbound
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.163 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9d:fe 10.100.0.6'], port_security=['fa:16:3e:44:9d:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ce59f3-d777-4899-bf5b-171901097199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0aded770-2a08-4693-9d94-82fba33c50bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8856130-4f24-493d-8324-579a0d608efb, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=29881f52-aa42-4a78-a87b-06e906811ff2) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.165 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 29881f52-aa42-4a78-a87b-06e906811ff2 in datapath f8ce59f3-d777-4899-bf5b-171901097199 bound to our chassis#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.169 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f8ce59f3-d777-4899-bf5b-171901097199#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.181 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[94e0b914-9473-48fd-ab84-7a4491ae2f0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.183 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf8ce59f3-d1 in ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.185 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf8ce59f3-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.185 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5515d766-deb4-4b6a-a731-787323c32881]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.186 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[85b4983d-7ca7-4dbe-a230-70057157fa38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.201 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c7cefe-4479-4df1-a66b-a39e64db1785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.230 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f671e10-98e1-4974-a807-40a3eafdd147]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.261 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b9d2ed-f2b1-425a-b4e8-17294a9d8afd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.268 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[81de614f-272c-4f63-bb0d-a5f3fcd7951d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.2692] manager: (tapf8ce59f3-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/220)
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.297 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[fe376e9d-55d2-4017-932d-887be8c61fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.300 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[87f27234-cf67-4bdc-b6ca-7e2bb6404f17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 systemd-udevd[233180]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.3251] device (tapf8ce59f3-d0): carrier: link connected
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.331 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f239f34b-50ab-4900-8e5d-34d2aac6042a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.351 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[454fbdbc-aca0-449b-9e14-aad0198b0344]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ce59f3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:b0:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605190, 'reachable_time': 31642, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233199, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.366 186962 INFO nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Post operation of migration started#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.372 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9df2da-c683-4d5b-afdb-f2b3b2f66789]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef7:b020'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 605190, 'tstamp': 605190}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233200, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.391 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[be67a838-ef7a-4907-8cce-e692a799b6e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf8ce59f3-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f7:b0:20'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605190, 'reachable_time': 31642, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233201, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.424 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[506bb0aa-9174-4ca8-ba03-95edffaedfc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.486 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c5f75d-0292-458f-a720-1b446d317eae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.487 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ce59f3-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.487 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.488 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8ce59f3-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.489 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.4902] manager: (tapf8ce59f3-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Nov 29 02:16:44 np0005539505 kernel: tapf8ce59f3-d0: entered promiscuous mode
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.493 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf8ce59f3-d0, col_values=(('external_ids', {'iface-id': 'c10b6573-55b8-4259-8949-c467435d65c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.493 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00430|binding|INFO|Releasing lport c10b6573-55b8-4259-8949-c467435d65c0 from this chassis (sb_readonly=0)
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.505 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.505 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.506 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bcce16db-d1b1-4a1d-b9f6-db1b164ab742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.507 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-f8ce59f3-d777-4899-bf5b-171901097199
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/f8ce59f3-d777-4899-bf5b-171901097199.pid.haproxy
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID f8ce59f3-d777-4899-bf5b-171901097199
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.507 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'env', 'PROCESS_TAG=haproxy-f8ce59f3-d777-4899-bf5b-171901097199', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f8ce59f3-d777-4899-bf5b-171901097199.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.751 186962 DEBUG nova.network.neutron [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.756 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.809 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.813 186962 DEBUG nova.virt.libvirt.vif [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.814 186962 DEBUG nova.network.os_vif_util [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.815 186962 DEBUG nova.network.os_vif_util [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.815 186962 DEBUG os_vif [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.817 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.817 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.820 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap816e158d-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.821 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap816e158d-4c, col_values=(('external_ids', {'iface-id': '816e158d-4c1c-4ea8-ae90-eb4e66048a31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:52:e0', 'vm-uuid': '7da96eef-5195-4fe9-8421-3b8b79420a86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.822 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.8240] manager: (tap816e158d-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/222)
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.826 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.827 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.827 186962 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.830 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.831 186962 INFO os_vif [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c')#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.833 186962 DEBUG nova.virt.libvirt.vif [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.833 186962 DEBUG nova.network.os_vif_util [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.835 186962 DEBUG nova.network.os_vif_util [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.840 186962 DEBUG nova.virt.libvirt.guest [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:16:44 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:0a:52:e0"/>
Nov 29 02:16:44 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:16:44 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:16:44 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:16:44 np0005539505 nova_compute[186958]:  <target dev="tap816e158d-4c"/>
Nov 29 02:16:44 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:16:44 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.843 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 kernel: tap816e158d-4c: entered promiscuous mode
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.8536] manager: (tap816e158d-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/223)
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00431|binding|INFO|Claiming lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 for this chassis.
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00432|binding|INFO|816e158d-4c1c-4ea8-ae90-eb4e66048a31: Claiming fa:16:3e:0a:52:e0 10.100.0.8
Nov 29 02:16:44 np0005539505 systemd-udevd[233198]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00433|binding|INFO|Setting lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 ovn-installed in OVS
Nov 29 02:16:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:44Z|00434|binding|INFO|Setting lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 up in Southbound
Nov 29 02:16:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:44.868 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:52:e0 10.100.0.8'], port_security=['fa:16:3e:0a:52:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=816e158d-4c1c-4ea8-ae90-eb4e66048a31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.8710] device (tap816e158d-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:16:44 np0005539505 NetworkManager[55134]: <info>  [1764400604.8727] device (tap816e158d-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:16:44 np0005539505 nova_compute[186958]: 2025-11-29 07:16:44.874 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:44 np0005539505 podman[233235]: 2025-11-29 07:16:44.859187029 +0000 UTC m=+0.033777229 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.205 186962 DEBUG nova.virt.libvirt.driver [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.206 186962 DEBUG nova.virt.libvirt.driver [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.206 186962 DEBUG nova.virt.libvirt.driver [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:c1:c1:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.207 186962 DEBUG nova.virt.libvirt.driver [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:4e:d4:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.208 186962 DEBUG nova.virt.libvirt.driver [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:0a:52:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.274 186962 DEBUG nova.virt.libvirt.guest [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:16:45</nova:creationTime>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:16:45 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:port uuid="b04bb5a4-4610-4151-a86a-f1f55b164195">
Nov 29 02:16:45 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:16:45 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:16:45 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:16:45 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:16:45 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:16:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:45.306 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.326 186962 DEBUG oslo_concurrency.lockutils [None req-f9ef3b90-6295-4d4b-ad75-b63db226d10b 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 16.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.471 186962 DEBUG nova.compute.manager [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.472 186962 DEBUG oslo_concurrency.lockutils [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.472 186962 DEBUG oslo_concurrency.lockutils [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.473 186962 DEBUG oslo_concurrency.lockutils [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.473 186962 DEBUG nova.compute.manager [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:45 np0005539505 nova_compute[186958]: 2025-11-29 07:16:45.474 186962 WARNING nova.compute.manager [req-f2fe8a4e-7228-4c49-b6bb-502be7798273 req-1ff8bae2-3dd0-4b3a-bb8e-73cb2b4b9141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:16:45 np0005539505 podman[233235]: 2025-11-29 07:16:45.888163274 +0000 UTC m=+1.062753424 container create ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:16:46 np0005539505 systemd[1]: Started libpod-conmon-ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f.scope.
Nov 29 02:16:46 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:16:46 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe0b007e7e140a4e2fd73a4d8818e53e23209cc52bf92b819138dac1b56f67c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:16:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:46Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:52:e0 10.100.0.8
Nov 29 02:16:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:16:46Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:52:e0 10.100.0.8
Nov 29 02:16:47 np0005539505 podman[233235]: 2025-11-29 07:16:47.042693401 +0000 UTC m=+2.217283581 container init ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:16:47 np0005539505 podman[233235]: 2025-11-29 07:16:47.048014432 +0000 UTC m=+2.222604602 container start ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:16:47 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [NOTICE]   (233259) : New worker (233261) forked
Nov 29 02:16:47 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [NOTICE]   (233259) : Loading success.
Nov 29 02:16:47 np0005539505 nova_compute[186958]: 2025-11-29 07:16:47.410 186962 DEBUG nova.network.neutron [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.467 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 816e158d-4c1c-4ea8-ae90-eb4e66048a31 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.469 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.483 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3bfeea21-33fc-47ee-9e93-b7d9532734c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.518 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6030e40c-eddb-448d-bb05-685e53ccbd54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.523 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3544d4-5e49-4dd0-8ab1-56ce741f01b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.558 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0a67f5eb-fee9-4da8-a5dc-d23d3195ddee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4419d2df-0fa5-427d-bc9a-2d91bca2d92b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233275, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.596 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[40b70198-5c52-4994-99be-e1c9363d9a13]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233276, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233276, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.599 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:47 np0005539505 nova_compute[186958]: 2025-11-29 07:16:47.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:47 np0005539505 nova_compute[186958]: 2025-11-29 07:16:47.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.603 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.603 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.604 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:16:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:16:47.604 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.037 186962 DEBUG nova.compute.manager [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.037 186962 DEBUG oslo_concurrency.lockutils [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.038 186962 DEBUG oslo_concurrency.lockutils [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.038 186962 DEBUG oslo_concurrency.lockutils [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.038 186962 DEBUG nova.compute.manager [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.039 186962 WARNING nova.compute.manager [req-98394974-5e8d-4294-b506-fb64ca3a49c7 req-f87ce9ff-4e54-4e94-be8e-ad11d8c226d6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.041 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.090 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000065', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '16d7af1670ea460db3d0422f176b6f98', 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'hostId': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.093 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000066', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.122 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.requests volume: 327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.123 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.135 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.136 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.136 186962 DEBUG oslo_concurrency.lockutils [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.142 186962 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Nov 29 02:16:48 np0005539505 virtqemud[186353]: Domain id=53 name='instance-00000066' uuid=704c4aa7-3239-4ecc-bfdc-c72642678363 is tainted: custom-monitor
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.145 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.145 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fc281ee-0bd3-40cd-9704-9612b6e21cf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 327, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.094552', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e642cfe-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'ae4e2bd3dcd7c3cecc5aac4664fbc4e482def0bd5b2170287da73cb8db452510'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 
'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.094552', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e643b36-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'aca739d727c8159724875c7c75efad3e9665568af3ec4466c8aac3a4a6913da1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.094552', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e67aa0a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '5ceb8349e7da2c67b3966c7224cb32b995fa47d7c84c22947b5e20bad9cc763a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.094552', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e67b590-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': 'e949dd514e1fe9e0133ac6e953a12c1ee7567cff62c17a33a89c6ef0c5ff81d7'}]}, 'timestamp': '2025-11-29 07:16:48.146076', '_unique_id': '09a983697f444a8b95bbdf5a6ae0a94b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.149 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.latency volume: 25067455189 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.149 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.149 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.latency volume: 8849701 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.149 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910d2963-ef2d-4d7e-8469-7a002201c1fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25067455189, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.149025', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6839b6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'bdda93c9d30245bc27efeddfb73a5f4c29c98ef5c6d153403c7f7be87c7ee815'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': 
None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.149025', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e68441a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'e450b65b044f88a1102498a141ae5dc1d7f42b13e25b4557343c2bda1aa4e07c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8849701, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.149025', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e684d70-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': 'de6f2de5322c63d7c309660164da5fdee0879eb72e1588fcbc3c3547a6454bf9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.149025', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e685676-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '037faaa3cdc295bbb3f74263f742f2f16ba54fd72067b699dccf5baaaec392fd'}]}, 'timestamp': '2025-11-29 07:16:48.150178', '_unique_id': '2a133852cd2c4e478c583c0952b3b774'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.151 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.bytes volume: 30300672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.152 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.152 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.152 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f590def8-98e9-424b-bfa2-22acf773958a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30300672, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.151939', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e68a554-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '7cedc3c14660b0604d4d4cde5908d613782f17096c62c577d2b5771ac10d0b14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.151939', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e68b012-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'ac4f7a7c297f74553786f3fbda736a5e5b23f6cf6fa76b647be34beaecf27dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.151939', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e68ba76-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '47ec5727a58ce05205d60117f0b2a825fa3c3ee802441fc3f64ed0943408a1db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.151939', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e68c37c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '8828c9b941d3c927f0d11c44470310fbbad023afffed4ec140ce7180bec1f49e'}]}, 'timestamp': '2025-11-29 07:16:48.152965', '_unique_id': '311622e5d54a4182a4109f1d6c4e1c9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.154 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.154 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.155 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.155 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5afcf9d5-546f-413b-8c63-981c0fe0fac6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.154544', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e690b70-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '8b0e09ffe2fc5a10ecd5c3f8d748aca2e435d819fc61c930d3f1f7bf5bd37d7f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 
'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.154544', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6914da-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': 'c7ea484d0b12aa09b02c105f6deae0e652f40aa80719b97a70be23bf74656b13'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.154544', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e691de0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '5895c413856e4dca113d67f456576ece05b10c0148287a6d2c8ea3c57007a83c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.154544', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6928c6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': 'e77a7d7fb61d363bba874e0fd88d769e001e888f432ace27978b39326c0021a3'}]}, 'timestamp': '2025-11-29 07:16:48.155559', '_unique_id': '8548037b5fec4be29a2458335ae079f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.159 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7da96eef-5195-4fe9-8421-3b8b79420a86 / tap7f253c88-5c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.160 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7da96eef-5195-4fe9-8421-3b8b79420a86 / tapb04bb5a4-46 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.160 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7da96eef-5195-4fe9-8421-3b8b79420a86 / tap816e158d-4c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.161 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.161 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.161 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.163 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 704c4aa7-3239-4ecc-bfdc-c72642678363 / tap29881f52-aa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.164 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '306e2330-4964-4793-8710-8e5d06105b63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.157207', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e6a0d36-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'ed40163bce92cea58408572d98927f8e8e354076dc87f5dc03818388abb5b768'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.157207', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e6a17ea-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '05f6debc4658fc5ee16bafb914ba7f4c690bc64fe8f6902b408f1fba9bdc4a9b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.157207', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e6a21b8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '0cb8c481ba9028b048597e996cf9684ca6b0708f80ecf9c00121190e4bfb863a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.157207', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e6a846e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': '477a1e957b6758fd7c2dbfe225ac5398105d6521ea32d7bb8931d0dc682fd07c'}]}, 'timestamp': '2025-11-29 07:16:48.164514', '_unique_id': 'd96e663f2fa945e6ac926a3bf565acb0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.174 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.174 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.181 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.182 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f788c2-81bc-47f0-9afc-18d875de7f99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.166422', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6c09ce-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': '87e65a558ea89f7ebe79b0ec9a164efbc6fd84f96116271d4b572662c1c19dea'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 
'7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.166422', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6c164e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': 'ff15531204e3563b9cfa223b003281aa61d80bcaf61d587cdb993dbadadbf80c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.166422', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6d3a24-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': 'a09eb1391db1500ab4d3a8b80bc12c959fd6845687de1f4450ec971914428960'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.166422', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6d458c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': 'a2fb3d183a35c0b17018a9966a961d67884367e765cb40b2ebf8831d18283e81'}]}, 'timestamp': '2025-11-29 07:16:48.182523', '_unique_id': 'e7d23c6ce2a141d3bc685319d65fd7e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.184 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.184 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.185 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.185 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acd65040-887d-452a-a395-91fd0c698ec1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.184528', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e6da0c2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'ee09ec824f3f40a710aac28f3d473563d65d31bcc3a96d8e18ac5971eeda60ac'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.184528', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e6dac20-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'ca463d78602e8c450c6484a40ab8d50470a4b2102dc2415c7f87233b444268a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.184528', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e6db95e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'd625e7b6be062f5cc1415a9b4c1f9b5b0c8ba1d7652b27b4b930e0e2c31ea21a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.184528', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e6dc624-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': '2224f7e97060e8b56878d85c270f8850d6835edc1706538be3407a20ccc2bdb5'}]}, 'timestamp': '2025-11-29 07:16:48.185854', '_unique_id': 'f01821863c0e4873ad104fd54ed04eb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.187 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.latency volume: 207912967 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.187 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.read.latency volume: 21055550 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.188 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.188 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7697a5e4-3944-4432-ba5b-4558b0ff8db0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 207912967, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.187668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6e1976-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '2f286b6eda4bac4c09f8034a70378a2373998320a096a11e418901476cb58978'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21055550, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 
'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.187668', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6e239e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '3cc4c139bf9a9c77397421c5ed2cc37a046b0499bcd48a569cb6b04e8b67bd0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.187668', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': 
'', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e6e3064-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': 'e4f5ccb5a5bd38a1e192a6fed1ab9ed0143b755fee5780c118403842fbc0c95a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.187668', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e6e3b90-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '726ff4cd6664a93c36dc99f570b05484c4fdda163c80e6ce54c9809d391a8373'}]}, 'timestamp': '2025-11-29 07:16:48.188810', '_unique_id': '147175d1adb549559b7db8a97e872aab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.190 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.190 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>]
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.191 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.191 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>]
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.191 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.192 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.192 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.192 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '413b257e-b6e9-4f26-9fa5-2e2ecd8bd79a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.191786', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e6ebb4c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '28d6a55475e9ecee9d2671bbbf8af9131767dfd0a750a483b709bac6cdf127d6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.191786', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e6ec84e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '266da2270f9720147fd5a258a73e42967fbb5f539462087600d2c02f975bbf60'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.191786', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e6ed4ec-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'd75f1a0b249d537adff2e65b29d4ba0a3c3fe4157f187e700f412527ebd3ed4f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.191786', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e6ee4e6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': '7798fa199e93c3cb19beb8d4acd826a3cf2edd2ae953be8ec664cfdbe5a75922'}]}, 'timestamp': '2025-11-29 07:16:48.193186', '_unique_id': 'b7773cf73f614209913699305c01bd77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.195 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.195 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.195 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.196 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24517785-dac6-411d-8eae-6f081f0a2c8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.195243', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e6f430a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '15168956d153778f0fa28c680eca829db1854fbecff897e27c5aead022373b6a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.195243', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e6f4fb2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'd3123a892bbcdde1d4eab296072290665af06c2c3a0dfbd618b6348a7cfb4fe0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.195243', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e6f59da-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '663787d058178cb282be1d6b7f9b6726f2f4fe269973197967d5724c786be56d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.195243', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e6f68b2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': 'c22d23d3a6df28bdef0e612d840ccaef78c44910d0b5d1e057957a72c7e33cb5'}]}, 'timestamp': '2025-11-29 07:16:48.196557', '_unique_id': '2fb5d5f777ea4c6d94a38d3ba52538de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.198 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.198 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.199 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.199 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.outgoing.bytes volume: 2313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946a1fc8-855c-44dc-8e23-3bd0bcdfd9d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.198560', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e6fc276-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'b1bddf62a01b25c23e8733a32e6a6d83d1990502f2311ab8fb884b798eb4bdbd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.198560', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e6fcc3a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'fdaece019a9c1a9d7227e38126feecb54495adeda4169eda4196660e89ee1972'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.198560', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e6fd590-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '75ba4475968ba07a6d02ed2c9e4b7630722eecd27c6a5de63bfe38a4fe86018f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2313, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.198560', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e6fe1f2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': 'c88917737ab5b69e84e71c03cd9e4fbdb8d41461fbcde8121fcbcea20a23bd68'}]}, 'timestamp': '2025-11-29 07:16:48.199632', '_unique_id': '483681efa4c2420799becc52d16fc170'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.201 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.201 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.201 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.201 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5c24a05-0d92-469a-a6f7-0a0cc1b19218', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.201164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e7028ec-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': '08e23b8bec9a571a819ec8c9cba5b7a9f441139284c27b807d960729b51f86d3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.201164', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e703256-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': '06d4ddddc8683e44fb0c42ba3378cd10e6c860bd396f9558e9967939b4ec5d18'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.201164', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e703c92-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': 'fc8298dffa8b85560b19fc358299165fdeb7c228d992433b740c3a353b84cc67'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.201164', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e70458e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': '85d2320bfbbfcddaeb47ce39f246b7517a48d6e94bf39d3305f2811d45064ed6'}]}, 'timestamp': '2025-11-29 07:16:48.202171', '_unique_id': '462e8b9446e940daac5de4f3dbfc6a21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.203 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes volume: 4447 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.203 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes volume: 1430 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.204 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.bytes volume: 1346 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.204 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.incoming.bytes volume: 1267 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37e47abc-67ab-48a3-93fb-b82502ae6ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4447, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.203691', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e708ada-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '638479683bfee96bc49eef0933b18eec9680ab7a374187e2c0d0d739d61b962e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1430, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.203691', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e70948a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '8332b9b3e9c49756df144191f2837eb4d12e792649a13b78732783c52f363144'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1346, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.203691', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e709f84-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '4576217297b2eb921c24a83e955275cbc4bedceeacafc768ba75580eda741335'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1267, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.203691', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e70a8f8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': 'b264002c73fb3fc0b32c51f63f444a785648976b6b77715213962f31cb3c45c7'}]}, 'timestamp': '2025-11-29 07:16:48.204725', '_unique_id': '05d7a4e4e3d24cd6868ea50db9180132'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.219 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/cpu volume: 11910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.233 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/cpu volume: 50000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9050d4ef-abd2-47b7-88c1-79b08a43ed3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11910000000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'timestamp': '2025-11-29T07:16:48.206290', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5e7302ce-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.860605855, 'message_signature': '2c5b3287d8baa6c25d6e95eb2a7b029845708dcf2bc9a351effaf475d313203f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 50000000, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 
'704c4aa7-3239-4ecc-bfdc-c72642678363', 'timestamp': '2025-11-29T07:16:48.206290', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5e751690-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.87417535, 'message_signature': '390ea441938a65e1a29340db2a75c47138eaf8fe2e9dd0500693f6a17c24d235'}]}, 'timestamp': '2025-11-29 07:16:48.233752', '_unique_id': '1495b912de0a46a994a8734bcb67b167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.234 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.235 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/memory.usage volume: 44.4375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.235 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/memory.usage volume: 42.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a54d03be-9477-4eae-a288-5e696ac5e074', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.4375, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'timestamp': '2025-11-29T07:16:48.235701', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5e756d3e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.860605855, 'message_signature': '89e56c0e3a20dfa187afb7e0d5ad0ce17592ad4847beb57b0b5d1d5db5037d1e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.60546875, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 
'704c4aa7-3239-4ecc-bfdc-c72642678363', 'timestamp': '2025-11-29T07:16:48.235701', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '5e75778e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.87417535, 'message_signature': 'b4d9530ef998eae3c0008b187bad9b3243a1a6837e133438b103fe02a842111f'}]}, 'timestamp': '2025-11-29 07:16:48.236260', '_unique_id': 'b3bec3716c3b4164bb5876d7147e5cba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.236 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.237 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.238 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.238 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.238 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c77ea3a7-a72a-4bb2-aaf9-1b0029617eef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.237815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e75bffa-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '9742b46343e6de0d4512cfc505fd71f358e1f54ac547faa73753f6ac173bdce0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.237815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e75ca54-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '6c0dcee1c28c965e42e15ae82064d4e78ef5754d22a24ee5e22f3c8219acb903'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.237815', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e75d40e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '09a54a0b14442c762d8278f3164994327ef9b39c2f16d11a80d0d3decd095119'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.237815', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e75deea-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': 'd50e1940aa99efb172718108fa8336913f85cb02fb7dc10bdf59b80b76756529'}]}, 'timestamp': '2025-11-29 07:16:48.238873', '_unique_id': 'fbe3f6216dcd4e359e9233e865fe7aa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.239 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.240 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.240 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.241 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.241 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0f6d0f2-680e-4e7b-a079-87092d0f044c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.240597', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e762df0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': '27d841d7285a656635aa9060abdbc945b19f4c62f7dff5bb86e16cc56085c545'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.240597', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e76378c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.807255182, 'message_signature': '1f9bae1ca535908217519e3b4686889cf8e0145cb43ec8062a99b720296dea59'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.240597', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e76415a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': '53cc7480096d411f48766535b90887774135073032b566aab684f8f3534f41fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.240597', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e764a4c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.815586458, 'message_signature': '8ff117594f7a56e3ca5ccec058f705457911279286e1a5b84e5946f9e4409c51'}]}, 'timestamp': '2025-11-29 07:16:48.241613', '_unique_id': '981ab59afd24463995fd5e260b16ab80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.243 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.243 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.243 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.243 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.outgoing.packets volume: 39 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b48f0a0d-f264-428a-8d67-695f84f97464', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.243155', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e76910a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '207243f51df2683112b81f9084513288fe8a5a920c4be65637dae8543d4685f3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.243155', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e769ace-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '17f45c44e3d22f5bca2173d62375af0863d08398a279e7307fac6fc286f35990'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.243155', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e76a406-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '88344030ada0991879d0c0c7b99cdddc49102d511a5751c71229ac19220d44a7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 39, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.243155', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e76ad5c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': 'b703f312bfe95b18fce89eec51f1229e6fbbc4c7b95662ef3634c322163d0ec2'}]}, 'timestamp': '2025-11-29 07:16:48.244155', '_unique_id': '44a89062f5da4f63ad364ac8696d9808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.245 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.245 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>]
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesTestJSON-server-1539987896>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1683200929>]
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.bytes volume: 73027584 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.246 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.247 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.bytes volume: 28672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.247 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cc60430-064d-4807-b8ba-de3ed6a447c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73027584, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-vda', 'timestamp': '2025-11-29T07:16:48.246541', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e771468-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '35c86b6de59a95a7c42d59f06287df2d6d57b67d15b562ee8250f5265ba918af'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 
'resource_id': '7da96eef-5195-4fe9-8421-3b8b79420a86-sda', 'timestamp': '2025-11-29T07:16:48.246541', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'instance-00000065', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e771db4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.735399743, 'message_signature': '26ef77f50515a77c8083ff1523cbefff27fc69c36b0012ec50596a298f800f4c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28672, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-vda', 'timestamp': '2025-11-29T07:16:48.246541', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5e7726d8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': '0b33d12b0fdd4aa3e965deb75ef44f00be394379cb6f825823a62c86968da446'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': '704c4aa7-3239-4ecc-bfdc-c72642678363-sda', 'timestamp': '2025-11-29T07:16:48.246541', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'instance-00000066', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5e7730c4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.764125228, 'message_signature': 'ccaaf383fb01a8d62b0aeaf1ecf4eb6753d5fa9e93f57af026ebefbe74293532'}]}, 'timestamp': '2025-11-29 07:16:48.247515', '_unique_id': '35625a0baeb942dd88f7d8286a7e3924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.248 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.249 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.249 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.249 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.249 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '533ab4b4-0438-4209-a631-8db4b35b5015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.249108', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e7779e4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '4bf5787e6ff849aa0d62bb51ade57c5bf77facda24505f619bb0afb13cf76fa6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.249108', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e778538-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '26d4817d1b51fdab101af07765c3e0af35e9feec9fd576270ede9372c481ba15'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.249108', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e778f74-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'd117841a755fe833ee8ceea9f60dfcc9d2bfaa7ed6157ee4faf306caba9579ed'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.249108', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e7798b6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': '20912b2fa65aa6727e5953ee90a4a0207d2d0e113387ebf969e3494ef7df14fa'}]}, 'timestamp': '2025-11-29 07:16:48.250181', '_unique_id': '2aab61b32ecd4e70b306b182efd9ef8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.251 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.252 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.252 12 DEBUG ceilometer.compute.pollsters [-] 7da96eef-5195-4fe9-8421-3b8b79420a86/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.252 12 DEBUG ceilometer.compute.pollsters [-] 704c4aa7-3239-4ecc-bfdc-c72642678363/network.incoming.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfc480c0-7d4a-4614-843e-00b573d9a4be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap7f253c88-5c', 'timestamp': '2025-11-29T07:16:48.251907', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap7f253c88-5c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c1:c1:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f253c88-5c'}, 'message_id': '5e77e636-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': '8577223af6d9f91ec2625d045560bcf24e97c56c4332b7fa02a8209d62deaf2a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tapb04bb5a4-46', 'timestamp': '2025-11-29T07:16:48.251907', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tapb04bb5a4-46', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:d4:98', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb04bb5a4-46'}, 'message_id': '5e77f144-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'c725f6aed091bbd889c2517e775ece2a8a9892f5814fd4eb5604cb4bbf3703ef'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-00000065-7da96eef-5195-4fe9-8421-3b8b79420a86-tap816e158d-4c', 'timestamp': '2025-11-29T07:16:48.251907', 'resource_metadata': {'display_name': 'tempest-AttachInterfacesTestJSON-server-1539987896', 'name': 'tap816e158d-4c', 'instance_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'instance_type': 'm1.nano', 'host': 
'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0a:52:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap816e158d-4c'}, 'message_id': '5e77fac2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.798075511, 'message_signature': 'b3e540551b38fc18dced4bdc2ecf02353ba9893c118aafc8e35f8c94c260137b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_name': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_name': None, 'resource_id': 'instance-00000066-704c4aa7-3239-4ecc-bfdc-c72642678363-tap29881f52-aa', 'timestamp': '2025-11-29T07:16:48.251907', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1683200929', 'name': 'tap29881f52-aa', 'instance_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'instance_type': 'm1.nano', 'host': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:9d:fe', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap29881f52-aa'}, 'message_id': '5e780404-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6055.802745234, 'message_signature': '3a3d7156f0e375a9f7e568b4b0b183874bfee6188b141d8552b1e569a814ac4b'}]}, 'timestamp': '2025-11-29 07:16:48.252927', '_unique_id': 'afb983899be44faeb4f388f12f413062'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:16:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:16:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.851 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-bfb6aa55-8943-44ec-93c0-037a3c64e742" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.851 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-bfb6aa55-8943-44ec-93c0-037a3c64e742" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:48 np0005539505 nova_compute[186958]: 2025-11-29 07:16:48.852 186962 DEBUG nova.objects.instance [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.148 186962 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.337 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.338 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.356 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.487 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.487 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.493 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.493 186962 INFO nova.compute.claims [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.563 186962 DEBUG nova.objects.instance [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.580 186962 DEBUG nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.707 186962 DEBUG nova.compute.provider_tree [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:16:49 np0005539505 podman[233277]: 2025-11-29 07:16:49.753575727 +0000 UTC m=+0.054813096 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:16:49 np0005539505 podman[233278]: 2025-11-29 07:16:49.75827581 +0000 UTC m=+0.060084505 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.823 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.846 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.946 186962 DEBUG nova.scheduler.client.report [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.971 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:49 np0005539505 nova_compute[186958]: 2025-11-29 07:16:49.972 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.062 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.062 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.083 186962 INFO nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.112 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.154 186962 DEBUG nova.policy [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.157 186962 INFO nova.virt.libvirt.driver [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.161 186962 DEBUG nova.compute.manager [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.181 186962 DEBUG nova.objects.instance [None req-da7c62b3-94c8-40ed-a93a-19d037a70c10 f1415f82cd0e4e9bb07aba94d430b9b7 5f0687dbac67422094b55a5950d8cceb - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.249 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.251 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.252 186962 INFO nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating image(s)#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.253 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.253 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.254 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.278 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.365 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.368 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.369 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.385 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.450 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.451 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.652 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk 1073741824" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.653 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.654 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.707 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.708 186962 DEBUG nova.virt.disk.api [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Checking if we can resize image /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.709 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.762 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.764 186962 DEBUG nova.virt.disk.api [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Cannot resize image /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.764 186962 DEBUG nova.objects.instance [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'migration_context' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.777 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.777 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Ensure instance console log exists: /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.778 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.778 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.778 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:16:50 np0005539505 nova_compute[186958]: 2025-11-29 07:16:50.924 186962 DEBUG nova.policy [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:16:52 np0005539505 nova_compute[186958]: 2025-11-29 07:16:52.878 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Successfully created port: c373f1d7-168e-494b-8e6f-c8af44b0db68 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.061 186962 DEBUG nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Successfully updated port: bfb6aa55-8943-44ec-93c0-037a3c64e742 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.081 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.082 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.082 186962 DEBUG nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.161 186962 DEBUG nova.compute.manager [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-changed-bfb6aa55-8943-44ec-93c0-037a3c64e742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.162 186962 DEBUG nova.compute.manager [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing instance network info cache due to event network-changed-bfb6aa55-8943-44ec-93c0-037a3c64e742. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.162 186962 DEBUG oslo_concurrency.lockutils [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.462 186962 WARNING nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.462 186962 WARNING nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:53 np0005539505 nova_compute[186958]: 2025-11-29 07:16:53.463 186962 WARNING nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:16:54 np0005539505 nova_compute[186958]: 2025-11-29 07:16:54.825 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:54 np0005539505 nova_compute[186958]: 2025-11-29 07:16:54.849 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:56 np0005539505 nova_compute[186958]: 2025-11-29 07:16:56.394 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Successfully updated port: c373f1d7-168e-494b-8e6f-c8af44b0db68 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.209 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.210 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.210 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.476 186962 DEBUG nova.compute.manager [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.476 186962 DEBUG nova.compute.manager [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing instance network info cache due to event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:16:57 np0005539505 nova_compute[186958]: 2025-11-29 07:16:57.476 186962 DEBUG oslo_concurrency.lockutils [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:16:58 np0005539505 nova_compute[186958]: 2025-11-29 07:16:58.202 186962 INFO nova.compute.manager [None req-e4a6dd71-6690-47cf-8039-8b67a8c06535 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Get console output#033[00m
Nov 29 02:16:58 np0005539505 nova_compute[186958]: 2025-11-29 07:16:58.208 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:16:58 np0005539505 nova_compute[186958]: 2025-11-29 07:16:58.667 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:16:59 np0005539505 nova_compute[186958]: 2025-11-29 07:16:59.828 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:16:59 np0005539505 nova_compute[186958]: 2025-11-29 07:16:59.852 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:00 np0005539505 nova_compute[186958]: 2025-11-29 07:17:00.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.419 186962 DEBUG nova.network.neutron [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.821 186962 DEBUG nova.compute.manager [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.822 186962 DEBUG nova.compute.manager [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing instance network info cache due to event network-changed-29881f52-aa42-4a78-a87b-06e906811ff2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.822 186962 DEBUG oslo_concurrency.lockutils [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.822 186962 DEBUG oslo_concurrency.lockutils [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.822 186962 DEBUG nova.network.neutron [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Refreshing network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.824 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.825 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance network_info: |[{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.825 186962 DEBUG oslo_concurrency.lockutils [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.825 186962 DEBUG nova.network.neutron [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.828 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start _get_guest_xml network_info=[{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.833 186962 WARNING nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.843 186962 DEBUG nova.virt.libvirt.host [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.844 186962 DEBUG nova.virt.libvirt.host [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.848 186962 DEBUG nova.virt.libvirt.host [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.848 186962 DEBUG nova.virt.libvirt.host [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.849 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.850 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.850 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.850 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.850 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.851 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.851 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.851 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.851 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.852 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.852 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.852 186962 DEBUG nova.virt.hardware [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.856 186962 DEBUG nova.virt.libvirt.vif [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNeg
ativeTestJSON-1191192320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:16:50Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.856 186962 DEBUG nova.network.os_vif_util [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.857 186962 DEBUG nova.network.os_vif_util [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.858 186962 DEBUG nova.objects.instance [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_devices' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.902 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <uuid>aa4795d1-71b1-415f-ac22-5bb11775bc84</uuid>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <name>instance-00000068</name>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersNegativeTestJSON-server-1506153238</nova:name>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:17:01</nova:creationTime>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:user uuid="2647a3e4fc214b4a85db1283eb7ef117">tempest-ServersNegativeTestJSON-1191192320-project-member</nova:user>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:project uuid="329bbbdd41424742b3045e77150a498e">tempest-ServersNegativeTestJSON-1191192320</nova:project>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        <nova:port uuid="c373f1d7-168e-494b-8e6f-c8af44b0db68">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="serial">aa4795d1-71b1-415f-ac22-5bb11775bc84</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="uuid">aa4795d1-71b1-415f-ac22-5bb11775bc84</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:1b:c4:96"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <target dev="tapc373f1d7-16"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/console.log" append="off"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:17:01 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:01 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:01 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:01 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.903 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Preparing to wait for external event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.903 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.903 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.904 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.904 186962 DEBUG nova.virt.libvirt.vif [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-
ServersNegativeTestJSON-1191192320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:16:50Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.905 186962 DEBUG nova.network.os_vif_util [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.905 186962 DEBUG nova.network.os_vif_util [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.906 186962 DEBUG os_vif [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.911 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.911 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.912 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.916 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc373f1d7-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.916 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc373f1d7-16, col_values=(('external_ids', {'iface-id': 'c373f1d7-168e-494b-8e6f-c8af44b0db68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:c4:96', 'vm-uuid': 'aa4795d1-71b1-415f-ac22-5bb11775bc84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.918 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:01 np0005539505 NetworkManager[55134]: <info>  [1764400621.9196] manager: (tapc373f1d7-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.920 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.925 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:01 np0005539505 nova_compute[186958]: 2025-11-29 07:17:01.926 186962 INFO os_vif [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16')#033[00m
Nov 29 02:17:02 np0005539505 podman[233340]: 2025-11-29 07:17:02.013059507 +0000 UTC m=+0.052047078 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:17:02 np0005539505 podman[233338]: 2025-11-29 07:17:02.013059047 +0000 UTC m=+0.055711902 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6)
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.112 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.113 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.113 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No VIF found with MAC fa:16:3e:1b:c4:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.113 186962 INFO nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Using config drive#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.117 186962 DEBUG nova.network.neutron [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.275 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.275 186962 DEBUG oslo_concurrency.lockutils [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.276 186962 DEBUG nova.network.neutron [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Refreshing network info cache for port bfb6aa55-8943-44ec-93c0-037a3c64e742 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.278 186962 DEBUG nova.virt.libvirt.vif [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.279 186962 DEBUG nova.network.os_vif_util [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.279 186962 DEBUG nova.network.os_vif_util [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.279 186962 DEBUG os_vif [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.280 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.280 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.282 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfb6aa55-89, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.283 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfb6aa55-89, col_values=(('external_ids', {'iface-id': 'bfb6aa55-8943-44ec-93c0-037a3c64e742', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:45:c4', 'vm-uuid': '7da96eef-5195-4fe9-8421-3b8b79420a86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.284 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.2854] manager: (tapbfb6aa55-89): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.285 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.291 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.292 186962 INFO os_vif [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89')#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.292 186962 DEBUG nova.virt.libvirt.vif [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.293 186962 DEBUG nova.network.os_vif_util [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.293 186962 DEBUG nova.network.os_vif_util [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.295 186962 DEBUG nova.virt.libvirt.guest [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:17:02 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:63:45:c4"/>
Nov 29 02:17:02 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:17:02 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:02 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:17:02 np0005539505 nova_compute[186958]:  <target dev="tapbfb6aa55-89"/>
Nov 29 02:17:02 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:17:02 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:17:02 np0005539505 kernel: tapbfb6aa55-89: entered promiscuous mode
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.3082] manager: (tapbfb6aa55-89): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.312 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:02Z|00435|binding|INFO|Claiming lport bfb6aa55-8943-44ec-93c0-037a3c64e742 for this chassis.
Nov 29 02:17:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:02Z|00436|binding|INFO|bfb6aa55-8943-44ec-93c0-037a3c64e742: Claiming fa:16:3e:63:45:c4 10.100.0.12
Nov 29 02:17:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:02Z|00437|binding|INFO|Setting lport bfb6aa55-8943-44ec-93c0-037a3c64e742 ovn-installed in OVS
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 systemd-udevd[233393]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.3564] device (tapbfb6aa55-89): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.3570] device (tapbfb6aa55-89): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.548 186962 INFO nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating config drive at /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.553 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p_t3_mo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.608 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.608 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.609 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.609 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.609 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.691 186962 DEBUG oslo_concurrency.processutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3p_t3_mo" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:02 np0005539505 kernel: tapc373f1d7-16: entered promiscuous mode
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.7400] manager: (tapc373f1d7-16): new Tun device (/org/freedesktop/NetworkManager/Devices/227)
Nov 29 02:17:02 np0005539505 systemd-udevd[233395]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.741 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:02Z|00438|if_status|INFO|Not updating pb chassis for c373f1d7-168e-494b-8e6f-c8af44b0db68 now as sb is readonly
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.7532] device (tapc373f1d7-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:17:02 np0005539505 NetworkManager[55134]: <info>  [1764400622.7541] device (tapc373f1d7-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.755 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 nova_compute[186958]: 2025-11-29 07:17:02.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:02 np0005539505 systemd-machined[153285]: New machine qemu-54-instance-00000068.
Nov 29 02:17:02 np0005539505 systemd[1]: Started Virtual Machine qemu-54-instance-00000068.
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.017 186962 DEBUG nova.network.neutron [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updated VIF entry in instance network info cache for port 29881f52-aa42-4a78-a87b-06e906811ff2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.018 186962 DEBUG nova.network.neutron [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [{"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.064 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400623.0633757, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.064 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Started (Lifecycle Event)#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.122 186962 DEBUG nova.network.neutron [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated VIF entry in instance network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.123 186962 DEBUG nova.network.neutron [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00439|binding|INFO|Claiming lport c373f1d7-168e-494b-8e6f-c8af44b0db68 for this chassis.
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00440|binding|INFO|c373f1d7-168e-494b-8e6f-c8af44b0db68: Claiming fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.162 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:45:c4 10.100.0.12'], port_security=['fa:16:3e:63:45:c4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-175058931', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-175058931', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=bfb6aa55-8943-44ec-93c0-037a3c64e742) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00441|binding|INFO|Setting lport bfb6aa55-8943-44ec-93c0-037a3c64e742 up in Southbound
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.163 104094 INFO neutron.agent.ovn.metadata.agent [-] Port bfb6aa55-8943-44ec-93c0-037a3c64e742 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00442|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 ovn-installed in OVS
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.165 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.166 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.181 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fdfeefb1-46d3-4b2e-8123-d30b3b1eceb2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.208 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bde4ea-8d15-4030-a63b-cc96fa3261a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.213 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5bf252d4-1847-478b-b64d-97e6f45cdb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.239 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a34925-92f2-4d33-9f25-dc92267f2ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.256 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8495f35a-f6df-43ce-8723-af94781c5972]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233434, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.273 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e0726c34-8cf0-48b3-86fa-13d2f911fc5e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233435, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233435, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.274 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.276 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.277 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.278 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.278 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.353 186962 INFO nova.compute.manager [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Terminating instance#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00443|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 up in Southbound
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.357 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.358 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 unbound from our chassis#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.361 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14d61e69-b152-4adc-a95c-58748969e299#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.373 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[47bfdbe2-64dc-4681-a861-c0f8f3b9401e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.374 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14d61e69-b1 in ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.376 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14d61e69-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.376 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[839cb9da-4346-4b72-808e-64275293ba64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.377 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3e049e4e-e837-44cc-8409-a5a28229654a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.383 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.390 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[25fd2c33-5eb6-45c5-940f-64d43629df46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.391 186962 DEBUG oslo_concurrency.lockutils [req-1a8c6c0c-1635-4301-8298-d1e14425a85d req-b2701fc7-65db-4a23-ad97-5ffe15943088 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.395 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400623.0636077, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.396 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.404 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[000d8c30-24cb-4480-b720-0f3719f8a954]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.407 186962 DEBUG nova.compute.manager [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.421 186962 DEBUG oslo_concurrency.lockutils [req-5ecadf78-17ee-4a42-a0a7-c41e20d96494 req-7deaba99-d214-48be-997f-74b771ed8632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-704c4aa7-3239-4ecc-bfdc-c72642678363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:03 np0005539505 kernel: tap29881f52-aa (unregistering): left promiscuous mode
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.426 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:03 np0005539505 NetworkManager[55134]: <info>  [1764400623.4291] device (tap29881f52-aa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.433 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.434 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b487797b-654b-47b0-abee-8313d2d5ced1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.434 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00444|binding|INFO|Releasing lport 29881f52-aa42-4a78-a87b-06e906811ff2 from this chassis (sb_readonly=0)
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00445|binding|INFO|Setting lport 29881f52-aa42-4a78-a87b-06e906811ff2 down in Southbound
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.436 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:c1:c1:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.436 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:4e:d4:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.436 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:0a:52:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.436 186962 DEBUG nova.virt.libvirt.driver [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:63:45:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00446|binding|INFO|Removing iface tap29881f52-aa ovn-installed in OVS
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.440 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.442 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:17:03 np0005539505 NetworkManager[55134]: <info>  [1764400623.4442] manager: (tap14d61e69-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/228)
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.446 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3285abe-544f-45d5-824f-e6e2a43e2799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.458 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.475 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[101e5d3c-0fc3-421d-9344-07e208c72e6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.479 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9745e94e-4051-4e60-bfae-2cddb332aeb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.483 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9d:fe 10.100.0.6'], port_security=['fa:16:3e:44:9d:fe 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '704c4aa7-3239-4ecc-bfdc-c72642678363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f8ce59f3-d777-4899-bf5b-171901097199', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '11', 'neutron:security_group_ids': '0aded770-2a08-4693-9d94-82fba33c50bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8856130-4f24-493d-8324-579a0d608efb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=29881f52-aa42-4a78-a87b-06e906811ff2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:03 np0005539505 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000066.scope: Deactivated successfully.
Nov 29 02:17:03 np0005539505 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000066.scope: Consumed 2.664s CPU time.
Nov 29 02:17:03 np0005539505 systemd-machined[153285]: Machine qemu-53-instance-00000066 terminated.
Nov 29 02:17:03 np0005539505 NetworkManager[55134]: <info>  [1764400623.5028] device (tap14d61e69-b0): carrier: link connected
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.508 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9758d811-92dd-412e-8fa2-e5817e836b59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.510 186962 DEBUG nova.virt.libvirt.guest [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:03</nova:creationTime>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:port uuid="b04bb5a4-4610-4151-a86a-f1f55b164195">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:03 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:03 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:03 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:03 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.511 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.525 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[188b4f94-1347-464d-b18b-86bda919e960]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607108, 'reachable_time': 39927, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233467, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.539 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c851ae40-f273-420d-9b21-1c1893a9f91e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:42d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 607108, 'tstamp': 607108}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233468, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.553 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[597d37c2-8ca2-4998-9913-b09bfe904f57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 145], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607108, 'reachable_time': 39927, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233469, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.585 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[74195c12-849d-442e-9323-e0e9a98eb097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.615 186962 DEBUG oslo_concurrency.lockutils [None req-09b60210-226d-417f-9a8b-73b782a991e2 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-bfb6aa55-8943-44ec-93c0-037a3c64e742" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 14.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:03 np0005539505 NetworkManager[55134]: <info>  [1764400623.6256] manager: (tap29881f52-aa): new Tun device (/org/freedesktop/NetworkManager/Devices/229)
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.635 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c84ed6f9-a9b9-4b56-8397-6ad2226deb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.635 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.636 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.636 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14d61e69-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.637 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 NetworkManager[55134]: <info>  [1764400623.6384] manager: (tap14d61e69-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Nov 29 02:17:03 np0005539505 kernel: tap14d61e69-b0: entered promiscuous mode
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.646 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.647 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14d61e69-b0, col_values=(('external_ids', {'iface-id': '17905b79-5cd7-4b55-9191-5d935325b1f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.648 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:03Z|00447|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.662 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.666 186962 INFO nova.virt.libvirt.driver [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Instance destroyed successfully.#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.666 186962 DEBUG nova.objects.instance [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 704c4aa7-3239-4ecc-bfdc-c72642678363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.667 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.668 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.669 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9e065664-a3f2-4f3d-a98b-b493c36e032e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.670 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:17:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:03.670 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'env', 'PROCESS_TAG=haproxy-14d61e69-b152-4adc-a95c-58748969e299', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14d61e69-b152-4adc-a95c-58748969e299.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.764 186962 DEBUG nova.virt.libvirt.vif [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:15:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1683200929',display_name='tempest-TestNetworkAdvancedServerOps-server-1683200929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1683200929',id=102,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJpBdlQTrwm1jTLhsIWvBArp7FJbNV/DmsxpavKG+fSfuJYeopMQPEBt+TLRsvwJz1i5TrgMP98T/zGS4tH40QimuRAQV56ulySp5fCUrK73vauhbVZ7xUa0c5MPUYrHZg==',key_name='tempest-TestNetworkAdvancedServerOps-1740209866',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-v4kh9s3g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:16:50Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=704c4aa7-3239-4ecc-bfdc-c72642678363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.765 186962 DEBUG nova.network.os_vif_util [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "29881f52-aa42-4a78-a87b-06e906811ff2", "address": "fa:16:3e:44:9d:fe", "network": {"id": "f8ce59f3-d777-4899-bf5b-171901097199", "bridge": "br-int", "label": "tempest-network-smoke--1303008278", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap29881f52-aa", "ovs_interfaceid": "29881f52-aa42-4a78-a87b-06e906811ff2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.766 186962 DEBUG nova.network.os_vif_util [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.766 186962 DEBUG os_vif [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.768 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29881f52-aa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.813 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.819 186962 INFO os_vif [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9d:fe,bridge_name='br-int',has_traffic_filtering=True,id=29881f52-aa42-4a78-a87b-06e906811ff2,network=Network(f8ce59f3-d777-4899-bf5b-171901097199),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap29881f52-aa')#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.821 186962 INFO nova.virt.libvirt.driver [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Deleting instance files /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363_del#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.822 186962 INFO nova.virt.libvirt.driver [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Deletion of /var/lib/nova/instances/704c4aa7-3239-4ecc-bfdc-c72642678363_del complete#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.918 186962 INFO nova.compute.manager [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Took 0.51 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.919 186962 DEBUG oslo.service.loopingcall [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.920 186962 DEBUG nova.compute.manager [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:17:03 np0005539505 nova_compute[186958]: 2025-11-29 07:17:03.920 186962 DEBUG nova.network.neutron [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:17:04 np0005539505 podman[233520]: 2025-11-29 07:17:04.011441077 +0000 UTC m=+0.053008075 container create 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:17:04 np0005539505 systemd[1]: Started libpod-conmon-84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9.scope.
Nov 29 02:17:04 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:17:04 np0005539505 podman[233520]: 2025-11-29 07:17:03.98369573 +0000 UTC m=+0.025262758 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:17:04 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dba7ac76e171b26d7c4d0c26c5cc2b44a89bc51ef048df2852b0927c97a510d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:17:04 np0005539505 podman[233520]: 2025-11-29 07:17:04.09366324 +0000 UTC m=+0.135230258 container init 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:17:04 np0005539505 podman[233520]: 2025-11-29 07:17:04.099083334 +0000 UTC m=+0.140650332 container start 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [NOTICE]   (233539) : New worker (233541) forked
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [NOTICE]   (233539) : Loading success.
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.153 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 29881f52-aa42-4a78-a87b-06e906811ff2 in datapath f8ce59f3-d777-4899-bf5b-171901097199 unbound from our chassis#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.155 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f8ce59f3-d777-4899-bf5b-171901097199, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.156 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e168d5d8-b2b1-4704-a906-3ebbbf54780e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.157 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 namespace which is not needed anymore#033[00m
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [NOTICE]   (233259) : haproxy version is 2.8.14-c23fe91
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [NOTICE]   (233259) : path to executable is /usr/sbin/haproxy
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [WARNING]  (233259) : Exiting Master process...
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [ALERT]    (233259) : Current worker (233261) exited with code 143 (Terminated)
Nov 29 02:17:04 np0005539505 neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199[233255]: [WARNING]  (233259) : All workers exited. Exiting... (0)
Nov 29 02:17:04 np0005539505 systemd[1]: libpod-ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f.scope: Deactivated successfully.
Nov 29 02:17:04 np0005539505 podman[233566]: 2025-11-29 07:17:04.272942887 +0000 UTC m=+0.039289966 container died ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:17:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:17:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay-fe0b007e7e140a4e2fd73a4d8818e53e23209cc52bf92b819138dac1b56f67c5-merged.mount: Deactivated successfully.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.312 186962 DEBUG nova.compute.manager [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.312 186962 DEBUG oslo_concurrency.lockutils [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.312 186962 DEBUG oslo_concurrency.lockutils [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.313 186962 DEBUG oslo_concurrency.lockutils [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:04 np0005539505 podman[233566]: 2025-11-29 07:17:04.313333193 +0000 UTC m=+0.079680272 container cleanup ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.313 186962 DEBUG nova.compute.manager [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.313 186962 WARNING nova.compute.manager [req-73cd26d7-ab69-4484-babd-86eb3de729ff req-445c963a-4388-49f9-bca3-a5243c23dc26 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:17:04 np0005539505 systemd[1]: libpod-conmon-ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f.scope: Deactivated successfully.
Nov 29 02:17:04 np0005539505 podman[233595]: 2025-11-29 07:17:04.378192303 +0000 UTC m=+0.042606520 container remove ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.386 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[12965343-dbcc-46b1-8527-3979d404cf3f]: (4, ('Sat Nov 29 07:17:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 (ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f)\ned9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f\nSat Nov 29 07:17:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 (ed9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f)\ned9fe363f029126648068b8d7b2b78d2ac9ff23b176331a88521d233b48cfe3f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.388 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[16304b56-3bbf-45b3-982a-203e9f59fcdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.390 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8ce59f3-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.392 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:04 np0005539505 kernel: tapf8ce59f3-d0: left promiscuous mode
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.411 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.415 186962 DEBUG nova.compute.manager [req-b2659f8a-8500-47de-acea-6fe8c7415d14 req-43587c52-b40f-4895-9a53-f3eba74ef8c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.415 186962 DEBUG oslo_concurrency.lockutils [req-b2659f8a-8500-47de-acea-6fe8c7415d14 req-43587c52-b40f-4895-9a53-f3eba74ef8c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.415 186962 DEBUG oslo_concurrency.lockutils [req-b2659f8a-8500-47de-acea-6fe8c7415d14 req-43587c52-b40f-4895-9a53-f3eba74ef8c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.416 186962 DEBUG oslo_concurrency.lockutils [req-b2659f8a-8500-47de-acea-6fe8c7415d14 req-43587c52-b40f-4895-9a53-f3eba74ef8c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.415 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bc576f32-196d-4c5c-b6e8-04679b556f23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.416 186962 DEBUG nova.compute.manager [req-b2659f8a-8500-47de-acea-6fe8c7415d14 req-43587c52-b40f-4895-9a53-f3eba74ef8c7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Processing event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.416 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.417 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.424 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400624.4242709, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.424 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Resumed (Lifecycle Event)
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.427 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.430 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance spawned successfully.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.431 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.433 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9a675d96-d967-468a-b0f1-b6cad2775eae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.435 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[566da363-d3a7-444d-ae13-2deacccd766d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.451 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[59f912e3-e23a-499e-a3b6-a0c60e458d27]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 605184, 'reachable_time': 24150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233610, 'error': None, 'target': 'ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.454 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f8ce59f3-d777-4899-bf5b-171901097199 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 02:17:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:04.454 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[48059dd9-fcb4-4206-bb13-a3dbe31915e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:17:04 np0005539505 systemd[1]: run-netns-ovnmeta\x2df8ce59f3\x2dd777\x2d4899\x2dbf5b\x2d171901097199.mount: Deactivated successfully.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.456 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.462 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.465 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.465 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.465 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.466 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.466 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.466 186962 DEBUG nova.virt.libvirt.driver [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.510 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.572 186962 INFO nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Took 14.32 seconds to spawn the instance on the hypervisor.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.573 186962 DEBUG nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.665 186962 INFO nova.compute.manager [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Took 15.22 seconds to build instance.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.700 186962 DEBUG nova.network.neutron [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.706 186962 DEBUG oslo_concurrency.lockutils [None req-ac126202-afd3-4733-8163-87321909e8d8 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.728 186962 INFO nova.compute.manager [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Took 0.81 seconds to deallocate network for instance.
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.748 186962 DEBUG nova.compute.manager [req-6a74c14d-0289-4dff-ace4-b4462562abf8 req-dc8f7391-4bcb-4a00-b339-c985961ddbf2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-deleted-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.835 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.835 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.840 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.882 186962 INFO nova.scheduler.client.report [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 704c4aa7-3239-4ecc-bfdc-c72642678363
Nov 29 02:17:04 np0005539505 nova_compute[186958]: 2025-11-29 07:17:04.888 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:17:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:04Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:45:c4 10.100.0.12
Nov 29 02:17:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:05Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:45:c4 10.100.0.12
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.023 186962 DEBUG oslo_concurrency.lockutils [None req-a59751f4-b505-4ad1-9963-67a3c83c38e7 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.083 186962 DEBUG nova.network.neutron [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updated VIF entry in instance network info cache for port bfb6aa55-8943-44ec-93c0-037a3c64e742. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.083 186962 DEBUG nova.network.neutron [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.107 186962 DEBUG oslo_concurrency.lockutils [req-4a24a65a-5b2a-48bf-aec8-35abf6d024a8 req-c6b21d77-874c-470c-b79a-ccc4070a9668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.336 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-b04bb5a4-4610-4151-a86a-f1f55b164195" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.336 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-b04bb5a4-4610-4151-a86a-f1f55b164195" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.355 186962 DEBUG nova.objects.instance [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.386 186962 DEBUG nova.virt.libvirt.vif [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.386 186962 DEBUG nova.network.os_vif_util [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.387 186962 DEBUG nova.network.os_vif_util [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.389 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.392 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.394 186962 DEBUG nova.virt.libvirt.driver [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Attempting to detach device tapb04bb5a4-46 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.394 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:4e:d4:98"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <target dev="tapb04bb5a4-46"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.400 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.402 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:03</nova:creationTime>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="b04bb5a4-4610-4151-a86a-f1f55b164195">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:4e:d4:98'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tapb04bb5a4-46'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:63:45:c4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tapbfb6aa55-89'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.403 186962 INFO nova.virt.libvirt.driver [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb04bb5a4-46 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the persistent domain config.#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.403 186962 DEBUG nova.virt.libvirt.driver [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] (1/8): Attempting to detach device tapb04bb5a4-46 with device alias net1 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.403 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:4e:d4:98"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <target dev="tapb04bb5a4-46"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 29 02:17:05 np0005539505 kernel: tapb04bb5a4-46 (unregistering): left promiscuous mode
Nov 29 02:17:05 np0005539505 NetworkManager[55134]: <info>  [1764400625.5076] device (tapb04bb5a4-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:05Z|00448|binding|INFO|Releasing lport b04bb5a4-4610-4151-a86a-f1f55b164195 from this chassis (sb_readonly=0)
Nov 29 02:17:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:05Z|00449|binding|INFO|Setting lport b04bb5a4-4610-4151-a86a-f1f55b164195 down in Southbound
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.519 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:05Z|00450|binding|INFO|Removing iface tapb04bb5a4-46 ovn-installed in OVS
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.527 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Received event <DeviceRemovedEvent: 1764400625.5248659, 7da96eef-5195-4fe9-8421-3b8b79420a86 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.529 186962 DEBUG nova.virt.libvirt.driver [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Start waiting for the detach event from libvirt for device tapb04bb5a4-46 with device alias net1 for instance 7da96eef-5195-4fe9-8421-3b8b79420a86 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.529 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.538 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:03</nova:creationTime>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="b04bb5a4-4610-4151-a86a-f1f55b164195">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:63:45:c4'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target dev='tapbfb6aa55-89'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='net3'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.539 186962 INFO nova.virt.libvirt.driver [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb04bb5a4-46 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the live domain config.#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.539 186962 DEBUG nova.virt.libvirt.vif [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.540 186962 DEBUG nova.network.os_vif_util [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.540 186962 DEBUG nova.network.os_vif_util [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.540 186962 DEBUG os_vif [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.543 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb04bb5a4-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.546 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.556 186962 INFO os_vif [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46')#033[00m
Nov 29 02:17:05 np0005539505 nova_compute[186958]: 2025-11-29 07:17:05.557 186962 DEBUG nova.virt.libvirt.guest [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:05</nova:creationTime>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:05 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:05 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:05 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:17:05 np0005539505 podman[233611]: 2025-11-29 07:17:05.600174185 +0000 UTC m=+0.052514001 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.054 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:d4:98 10.100.0.13'], port_security=['fa:16:3e:4e:d4:98 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b04bb5a4-4610-4151-a86a-f1f55b164195) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.056 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b04bb5a4-4610-4151-a86a-f1f55b164195 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.061 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.075 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0b2fa7-4e18-4088-89fd-93a22c786bd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.105 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a627627b-3c8e-4470-8a7b-f8a60b9aac3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.109 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[987d6e49-48bf-4b15-a37d-a8ca83938f10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.150 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[97ad23cd-5b7d-49a7-ad0d-3835532adc74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.180 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[93312815-4a26-427f-b7f1-1e34f2676a81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233642, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.196 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb012b2-0844-4f4e-bbc5-b0a964af5b88]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233643, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233643, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.198 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.200 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.201 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.201 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.202 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.202 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:06.203 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.315 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.316 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.318 186962 DEBUG nova.network.neutron [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.385 186962 DEBUG nova.compute.manager [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-deleted-b04bb5a4-4610-4151-a86a-f1f55b164195 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.386 186962 INFO nova.compute.manager [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Neutron deleted interface b04bb5a4-4610-4151-a86a-f1f55b164195; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.386 186962 DEBUG nova.network.neutron [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.415 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.415 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.416 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.416 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.416 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.417 186962 WARNING nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-bfb6aa55-8943-44ec-93c0-037a3c64e742 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.417 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.417 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.417 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.417 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.418 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.418 186962 WARNING nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-unplugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.418 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.418 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.419 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.419 186962 DEBUG oslo_concurrency.lockutils [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "704c4aa7-3239-4ecc-bfdc-c72642678363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.419 186962 DEBUG nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] No waiting events found dispatching network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.419 186962 WARNING nova.compute.manager [req-1a641b6b-d454-4905-ac81-d5883a74cced req-5ce27e37-062b-4f8e-87c2-3499761b773e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Received unexpected event network-vif-plugged-29881f52-aa42-4a78-a87b-06e906811ff2 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.421 186962 DEBUG nova.objects.instance [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'system_metadata' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.445 186962 DEBUG nova.objects.instance [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.465 186962 DEBUG nova.virt.libvirt.vif [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.465 186962 DEBUG nova.network.os_vif_util [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.466 186962 DEBUG nova.network.os_vif_util [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.468 186962 DEBUG nova.virt.libvirt.guest [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.472 186962 DEBUG nova.virt.libvirt.guest [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:05</nova:creationTime>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:63:45:c4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tapbfb6aa55-89'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.472 186962 DEBUG nova.virt.libvirt.guest [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.478 186962 DEBUG nova.virt.libvirt.guest [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4e:d4:98"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb04bb5a4-46"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:05</nova:creationTime>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:63:45:c4'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target dev='tapbfb6aa55-89'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='net3'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.479 186962 WARNING nova.virt.libvirt.driver [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Detaching interface fa:16:3e:4e:d4:98 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapb04bb5a4-46' not found.#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.479 186962 DEBUG nova.virt.libvirt.vif [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.480 186962 DEBUG nova.network.os_vif_util [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "b04bb5a4-4610-4151-a86a-f1f55b164195", "address": "fa:16:3e:4e:d4:98", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb04bb5a4-46", "ovs_interfaceid": "b04bb5a4-4610-4151-a86a-f1f55b164195", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.480 186962 DEBUG nova.network.os_vif_util [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.480 186962 DEBUG os_vif [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.482 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb04bb5a4-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.482 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.484 186962 INFO os_vif [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:d4:98,bridge_name='br-int',has_traffic_filtering=True,id=b04bb5a4-4610-4151-a86a-f1f55b164195,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb04bb5a4-46')#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.485 186962 DEBUG nova.virt.libvirt.guest [req-088bf3e6-4350-4909-83d6-e3d1c368e08b req-816e74a9-71b1-40ed-8f1a-c2f996681b31 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:06</nova:creationTime>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:06 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:06 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:06 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.500 186962 DEBUG nova.compute.manager [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.500 186962 DEBUG oslo_concurrency.lockutils [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.500 186962 DEBUG oslo_concurrency.lockutils [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.501 186962 DEBUG oslo_concurrency.lockutils [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.501 186962 DEBUG nova.compute.manager [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:06 np0005539505 nova_compute[186958]: 2025-11-29 07:17:06.501 186962 WARNING nova.compute.manager [req-01136b20-06ee-4229-9b80-893e76ae8f9b req-d7667150-4e3c-4574-af55-c4c6029fc02a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:17:07 np0005539505 nova_compute[186958]: 2025-11-29 07:17:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:08 np0005539505 nova_compute[186958]: 2025-11-29 07:17:08.428 186962 INFO nova.network.neutron [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Port b04bb5a4-4610-4151-a86a-f1f55b164195 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.592 104094 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bc75b2f0-888b-4bd0-a61d-24bc541e954a with type ""#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.594 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:45:c4 10.100.0.12'], port_security=['fa:16:3e:63:45:c4 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-175058931', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-175058931', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=bfb6aa55-8943-44ec-93c0-037a3c64e742) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.596 104094 INFO neutron.agent.ovn.metadata.agent [-] Port bfb6aa55-8943-44ec-93c0-037a3c64e742 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.599 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:17:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:08Z|00451|binding|INFO|Removing iface tapbfb6aa55-89 ovn-installed in OVS
Nov 29 02:17:08 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:08Z|00452|binding|INFO|Removing lport bfb6aa55-8943-44ec-93c0-037a3c64e742 ovn-installed in OVS
Nov 29 02:17:08 np0005539505 nova_compute[186958]: 2025-11-29 07:17:08.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.644 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[893aee7d-dfea-4f9f-bd2e-d855fc7600fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.678 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[333adb71-fd80-445e-b8f8-1a76251fb1e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.682 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1035c6-4e0a-4dc8-b6e1-5c97381e513f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.719 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1c1c1ddf-1780-45a7-96fb-2e52ad930dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5778ca7e-c821-4585-9aaa-e2129e6f2fbf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 784, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 25646, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233649, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.754 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[56f66002-d706-4ed3-b4c7-6196d6c27609]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233650, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233650, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.755 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:08 np0005539505 nova_compute[186958]: 2025-11-29 07:17:08.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.762 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.762 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:08 np0005539505 nova_compute[186958]: 2025-11-29 07:17:08.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.763 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:08.763 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.013 186962 DEBUG nova.compute.manager [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-deleted-bfb6aa55-8943-44ec-93c0-037a3c64e742 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.013 186962 INFO nova.compute.manager [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Neutron deleted interface bfb6aa55-8943-44ec-93c0-037a3c64e742; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.013 186962 DEBUG nova.network.neutron [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.096 186962 DEBUG nova.objects.instance [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'system_metadata' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.132 186962 DEBUG nova.objects.instance [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'flavor' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.171 186962 DEBUG nova.virt.libvirt.vif [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.171 186962 DEBUG nova.network.os_vif_util [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.171 186962 DEBUG nova.network.os_vif_util [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.174 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.175 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.177 186962 DEBUG nova.virt.libvirt.driver [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Attempting to detach device tapbfb6aa55-89 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.177 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:63:45:c4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <target dev="tapbfb6aa55-89"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.182 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.184 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:06</nova:creationTime>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:63:45:c4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='tapbfb6aa55-89'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='net3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.184 186962 INFO nova.virt.libvirt.driver [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully detached device tapbfb6aa55-89 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the persistent domain config.
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.185 186962 DEBUG nova.virt.libvirt.driver [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] (1/8): Attempting to detach device tapbfb6aa55-89 with device alias net3 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.185 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:63:45:c4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <target dev="tapbfb6aa55-89"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:17:09 np0005539505 kernel: tapbfb6aa55-89 (unregistering): left promiscuous mode
Nov 29 02:17:09 np0005539505 NetworkManager[55134]: <info>  [1764400629.2969] device (tapbfb6aa55-89): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.312 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Received event <DeviceRemovedEvent: 1764400629.312352, 7da96eef-5195-4fe9-8421-3b8b79420a86 => net3> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.315 186962 DEBUG nova.virt.libvirt.driver [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Start waiting for the detach event from libvirt for device tapbfb6aa55-89 with device alias net3 for instance 7da96eef-5195-4fe9-8421-3b8b79420a86 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.315 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.320 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:45:c4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbfb6aa55-89"/></interface>not found in domain: <domain type='kvm' id='51'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <name>instance-00000065</name>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <uuid>7da96eef-5195-4fe9-8421-3b8b79420a86</uuid>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:06</nova:creationTime>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="bfb6aa55-8943-44ec-93c0-037a3c64e742">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='serial'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='uuid'>7da96eef-5195-4fe9-8421-3b8b79420a86</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk' index='2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/disk.config' index='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:c1:c1:22'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='tap7f253c88-5c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0a:52:e0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target dev='tap816e158d-4c'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='net2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86/console.log' append='off'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c219,c799</label>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c219,c799</imagelabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.321 186962 INFO nova.virt.libvirt.driver [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully detached device tapbfb6aa55-89 from instance 7da96eef-5195-4fe9-8421-3b8b79420a86 from the live domain config.#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.321 186962 DEBUG nova.virt.libvirt.vif [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.322 186962 DEBUG nova.network.os_vif_util [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.322 186962 DEBUG nova.network.os_vif_util [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.322 186962 DEBUG os_vif [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.324 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.324 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfb6aa55-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.325 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.326 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.328 186962 INFO os_vif [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89')#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.329 186962 DEBUG nova.virt.libvirt.guest [req-b91d3f3e-2622-4301-9e03-991a4ad5c9a5 req-54447fe2-b297-46c4-a7df-e331df6d0233 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1539987896</nova:name>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:17:09</nova:creationTime>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="7f253c88-5c90-410c-bbe6-a152ae7c3a63">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    <nova:port uuid="816e158d-4c1c-4ea8-ae90-eb4e66048a31">
Nov 29 02:17:09 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:17:09 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:17:09 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:17:09 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.601 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.601 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.601 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.602 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.602 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:09 np0005539505 nova_compute[186958]: 2025-11-29 07:17:09.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.389 186962 INFO nova.compute.manager [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Terminating instance#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.490 186962 DEBUG nova.compute.manager [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:17:10 np0005539505 kernel: tap7f253c88-5c (unregistering): left promiscuous mode
Nov 29 02:17:10 np0005539505 NetworkManager[55134]: <info>  [1764400630.5563] device (tap7f253c88-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.565 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00453|binding|INFO|Releasing lport 7f253c88-5c90-410c-bbe6-a152ae7c3a63 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00454|binding|INFO|Setting lport 7f253c88-5c90-410c-bbe6-a152ae7c3a63 down in Southbound
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00455|binding|INFO|Removing iface tap7f253c88-5c ovn-installed in OVS
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.579 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 kernel: tap816e158d-4c (unregistering): left promiscuous mode
Nov 29 02:17:10 np0005539505 NetworkManager[55134]: <info>  [1764400630.5971] device (tap816e158d-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00456|binding|INFO|Releasing lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 from this chassis (sb_readonly=1)
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00457|binding|INFO|Removing iface tap816e158d-4c ovn-installed in OVS
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00458|if_status|INFO|Dropped 1 log messages in last 66 seconds (most recently, 66 seconds ago) due to excessive rate
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00459|if_status|INFO|Not setting lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 down as sb is readonly
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00460|binding|INFO|Setting lport 816e158d-4c1c-4ea8-ae90-eb4e66048a31 down in Southbound
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.659 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:c1:22 10.100.0.5'], port_security=['fa:16:3e:c1:c1:22 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07f51098-ec31-4030-87d3-0b3bc87fde1f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7f253c88-5c90-410c-bbe6-a152ae7c3a63) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.660 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7f253c88-5c90-410c-bbe6-a152ae7c3a63 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.662 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:17:10 np0005539505 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000065.scope: Deactivated successfully.
Nov 29 02:17:10 np0005539505 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000065.scope: Consumed 15.684s CPU time.
Nov 29 02:17:10 np0005539505 systemd-machined[153285]: Machine qemu-51-instance-00000065 terminated.
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.679 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fe649277-e36e-443e-81e3-af606ae44d78]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.713 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd6c0c-7d22-4b20-86ed-2c2f860e81c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.720 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b7e13241-c30c-4ce3-935f-7a1480254d39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 NetworkManager[55134]: <info>  [1764400630.7256] manager: (tap816e158d-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/231)
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.733 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:52:e0 10.100.0.8'], port_security=['fa:16:3e:0a:52:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '7da96eef-5195-4fe9-8421-3b8b79420a86', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=816e158d-4c1c-4ea8-ae90-eb4e66048a31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.769 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2df2809c-94f1-4bec-8c19-fa9eed1fd72c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.773 186962 INFO nova.virt.libvirt.driver [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Instance destroyed successfully.#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.773 186962 DEBUG nova.objects.instance [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'resources' on Instance uuid 7da96eef-5195-4fe9-8421-3b8b79420a86 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00461|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00462|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.788 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4404cf-2ee0-4c44-ad18-c05871fdf8ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600202, 'reachable_time': 32912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233700, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.807 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1487a1e0-7b62-466d-9670-590927c9863e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600212, 'tstamp': 600212}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233701, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600215, 'tstamp': 600215}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233701, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.813 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.851 186962 DEBUG nova.virt.libvirt.vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.852 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.852 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.853 186962 DEBUG os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.854 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.854 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f253c88-5c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.857 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.858 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.961 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.961 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.962 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.962 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.963 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 816e158d-4c1c-4ea8-ae90-eb4e66048a31 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.968 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90812230-35cb-4e21-b16b-75b900100d8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.969 186962 INFO os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:c1:22,bridge_name='br-int',has_traffic_filtering=True,id=7f253c88-5c90-410c-bbe6-a152ae7c3a63,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f253c88-5c')#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.969 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a51f0196-fbd1-44f6-b0fa-f78111a42e4a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.970 186962 DEBUG nova.virt.libvirt.vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.970 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:10.970 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace which is not needed anymore#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.970 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.971 186962 DEBUG os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.972 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.973 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap816e158d-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.974 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00463|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:10Z|00464|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.977 186962 INFO os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:52:e0,bridge_name='br-int',has_traffic_filtering=True,id=816e158d-4c1c-4ea8-ae90-eb4e66048a31,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap816e158d-4c')#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.978 186962 DEBUG nova.virt.libvirt.vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:15:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1539987896',display_name='tempest-AttachInterfacesTestJSON-server-1539987896',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1539987896',id=101,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBII2YYwxWkt73SgpvAmsDyKOHB9VWoRCm91CvMkDoEDeynxXPZ4Qk6OVRhPtomsaWXRdCwkFZpA539VN244BtvopWTF5cQ+bB2ByLpYI6vzc620toLv3pN/Ifm8jwGQItQ==',key_name='tempest-keypair-850376853',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:15:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-g0o27syg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:15:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=7da96eef-5195-4fe9-8421-3b8b79420a86,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.978 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.978 186962 DEBUG nova.network.os_vif_util [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.979 186962 DEBUG os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.980 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfb6aa55-89, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.980 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.982 186962 INFO os_vif [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:45:c4,bridge_name='br-int',has_traffic_filtering=True,id=bfb6aa55-8943-44ec-93c0-037a3c64e742,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbfb6aa55-89')#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.982 186962 INFO nova.virt.libvirt.driver [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Deleting instance files /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86_del#033[00m
Nov 29 02:17:10 np0005539505 nova_compute[186958]: 2025-11-29 07:17:10.983 186962 INFO nova.virt.libvirt.driver [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Deletion of /var/lib/nova/instances/7da96eef-5195-4fe9-8421-3b8b79420a86_del complete#033[00m
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [NOTICE]   (232568) : haproxy version is 2.8.14-c23fe91
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [NOTICE]   (232568) : path to executable is /usr/sbin/haproxy
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [WARNING]  (232568) : Exiting Master process...
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [WARNING]  (232568) : Exiting Master process...
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [ALERT]    (232568) : Current worker (232570) exited with code 143 (Terminated)
Nov 29 02:17:11 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[232561]: [WARNING]  (232568) : All workers exited. Exiting... (0)
Nov 29 02:17:11 np0005539505 systemd[1]: libpod-c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c.scope: Deactivated successfully.
Nov 29 02:17:11 np0005539505 conmon[232561]: conmon c9c92f4a45eb5bf05fff <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c.scope/container/memory.events
Nov 29 02:17:11 np0005539505 podman[233720]: 2025-11-29 07:17:11.123908189 +0000 UTC m=+0.049371772 container died c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:17:11 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c-userdata-shm.mount: Deactivated successfully.
Nov 29 02:17:11 np0005539505 systemd[1]: var-lib-containers-storage-overlay-88e9b78e730c9b3d65dafb4ca93b833fdb4de6c29794669184b2ab67acfbc325-merged.mount: Deactivated successfully.
Nov 29 02:17:11 np0005539505 podman[233720]: 2025-11-29 07:17:11.166745775 +0000 UTC m=+0.092209318 container cleanup c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:17:11 np0005539505 systemd[1]: libpod-conmon-c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c.scope: Deactivated successfully.
Nov 29 02:17:11 np0005539505 podman[233751]: 2025-11-29 07:17:11.226116169 +0000 UTC m=+0.038394190 container remove c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.236 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[533e564a-52da-4009-8baa-6e5a99f8787b]: (4, ('Sat Nov 29 07:17:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c)\nc9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c\nSat Nov 29 07:17:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (c9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c)\nc9c92f4a45eb5bf05fff0421d838303893761761b132ff7d48239c99936eb49c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.237 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[556efc58-de4a-48b5-9021-4665fc7b7ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.238 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.240 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:11 np0005539505 kernel: tap90812230-30: left promiscuous mode
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.253 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[63dbb7a2-8540-4e4d-b937-761156abda89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.271 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7aaddcfa-1006-4dd2-8004-7aa164679543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.272 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21c655d9-db8b-463b-b1ea-8bc32127d4fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.285 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f84666e-4df3-49e1-bb0c-3de74730f66f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600196, 'reachable_time': 30175, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233766, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.287 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:17:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:11.287 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc67fea-88b3-4456-b38d-f6af4560cb0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:11 np0005539505 systemd[1]: run-netns-ovnmeta\x2d90812230\x2d35cb\x2d4e21\x2db16b\x2d75b900100d8b.mount: Deactivated successfully.
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.312 186962 DEBUG nova.compute.manager [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-unplugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.313 186962 DEBUG oslo_concurrency.lockutils [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.313 186962 DEBUG oslo_concurrency.lockutils [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.313 186962 DEBUG oslo_concurrency.lockutils [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.313 186962 DEBUG nova.compute.manager [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-unplugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.313 186962 DEBUG nova.compute.manager [req-918a269b-3df5-46fa-8f36-d31463da7b2e req-18068526-3dc1-437d-9920-40e450633ec8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-unplugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.315 186962 DEBUG nova.compute.manager [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-unplugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.315 186962 DEBUG oslo_concurrency.lockutils [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.315 186962 DEBUG oslo_concurrency.lockutils [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.316 186962 DEBUG oslo_concurrency.lockutils [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.316 186962 DEBUG nova.compute.manager [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-unplugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.316 186962 DEBUG nova.compute.manager [req-c59472ca-9f4f-4e93-9430-ca2c0f2bc408 req-67dfb606-ae4d-47f1-b0d1-1879a22d7f20 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-unplugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.376 186962 INFO nova.compute.manager [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.377 186962 DEBUG oslo.service.loopingcall [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.377 186962 DEBUG nova.compute.manager [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.377 186962 DEBUG nova.network.neutron [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:17:11 np0005539505 nova_compute[186958]: 2025-11-29 07:17:11.931 186962 DEBUG nova.network.neutron [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "address": "fa:16:3e:c1:c1:22", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f253c88-5c", "ovs_interfaceid": "7f253c88-5c90-410c-bbe6-a152ae7c3a63", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:12 np0005539505 nova_compute[186958]: 2025-11-29 07:17:12.549 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:12 np0005539505 nova_compute[186958]: 2025-11-29 07:17:12.550 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:17:12 np0005539505 nova_compute[186958]: 2025-11-29 07:17:12.550 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.797 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.798 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.798 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.798 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.816 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-7da96eef-5195-4fe9-8421-3b8b79420a86" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:13 np0005539505 podman[233768]: 2025-11-29 07:17:13.918387157 +0000 UTC m=+0.076326107 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:17:13 np0005539505 podman[233769]: 2025-11-29 07:17:13.928056761 +0000 UTC m=+0.084167829 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.941 186962 DEBUG nova.compute.manager [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.942 186962 DEBUG oslo_concurrency.lockutils [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.942 186962 DEBUG oslo_concurrency.lockutils [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.942 186962 DEBUG oslo_concurrency.lockutils [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.943 186962 DEBUG nova.compute.manager [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.943 186962 WARNING nova.compute.manager [req-5963cb16-bd5e-46f0-9839-46d4a052e3f2 req-055ec770-523e-4b51-95e7-a66565cec31b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-7f253c88-5c90-410c-bbe6-a152ae7c3a63 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.947 186962 DEBUG oslo_concurrency.lockutils [None req-e0fd089b-1489-4372-97a7-659022be15a6 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-7da96eef-5195-4fe9-8421-3b8b79420a86-b04bb5a4-4610-4151-a86a-f1f55b164195" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 8.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:13 np0005539505 nova_compute[186958]: 2025-11-29 07:17:13.969 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.025 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.026 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.091 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.215 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.217 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5539MB free_disk=73.22386932373047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.217 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.217 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:14 np0005539505 nova_compute[186958]: 2025-11-29 07:17:14.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.136 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 7da96eef-5195-4fe9-8421-3b8b79420a86 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.137 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance aa4795d1-71b1-415f-ac22-5bb11775bc84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.137 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.138 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.179 186962 DEBUG nova.compute.manager [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.180 186962 DEBUG oslo_concurrency.lockutils [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.180 186962 DEBUG oslo_concurrency.lockutils [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.180 186962 DEBUG oslo_concurrency.lockutils [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.181 186962 DEBUG nova.compute.manager [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] No waiting events found dispatching network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.181 186962 WARNING nova.compute.manager [req-739d2948-9d56-4789-8601-b7f3f8ff0549 req-9e99d665-417a-4fdb-9058-beb6a3307561 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received unexpected event network-vif-plugged-816e158d-4c1c-4ea8-ae90-eb4e66048a31 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.763 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.788 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.900 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.901 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.684s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.902 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.902 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:17:15 np0005539505 nova_compute[186958]: 2025-11-29 07:17:15.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.072 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:17:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:16Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:17:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:16Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.897 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.898 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.898 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.899 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:17:16 np0005539505 nova_compute[186958]: 2025-11-29 07:17:16.919 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 02:17:17 np0005539505 nova_compute[186958]: 2025-11-29 07:17:17.194 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:17 np0005539505 nova_compute[186958]: 2025-11-29 07:17:17.195 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:17 np0005539505 nova_compute[186958]: 2025-11-29 07:17:17.196 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:17:17 np0005539505 nova_compute[186958]: 2025-11-29 07:17:17.196 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:18 np0005539505 nova_compute[186958]: 2025-11-29 07:17:18.664 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400623.664296, 704c4aa7-3239-4ecc-bfdc-c72642678363 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:18 np0005539505 nova_compute[186958]: 2025-11-29 07:17:18.665 186962 INFO nova.compute.manager [-] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:17:18 np0005539505 nova_compute[186958]: 2025-11-29 07:17:18.694 186962 DEBUG nova.compute.manager [None req-2a6e6e03-f962-4bd4-85dc-ae8fed8bee2b - - - - - -] [instance: 704c4aa7-3239-4ecc-bfdc-c72642678363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:18 np0005539505 nova_compute[186958]: 2025-11-29 07:17:18.973 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:19 np0005539505 nova_compute[186958]: 2025-11-29 07:17:19.429 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:19 np0005539505 nova_compute[186958]: 2025-11-29 07:17:19.429 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:17:19 np0005539505 nova_compute[186958]: 2025-11-29 07:17:19.429 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:19 np0005539505 nova_compute[186958]: 2025-11-29 07:17:19.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.018 186962 DEBUG nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-deleted-7f253c88-5c90-410c-bbe6-a152ae7c3a63 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.018 186962 INFO nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Neutron deleted interface 7f253c88-5c90-410c-bbe6-a152ae7c3a63; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.018 186962 DEBUG nova.network.neutron [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [{"id": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "address": "fa:16:3e:0a:52:e0", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap816e158d-4c", "ovs_interfaceid": "816e158d-4c1c-4ea8-ae90-eb4e66048a31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "address": "fa:16:3e:63:45:c4", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb6aa55-89", "ovs_interfaceid": "bfb6aa55-8943-44ec-93c0-037a3c64e742", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.046 186962 DEBUG nova.compute.manager [req-9b161218-1693-4317-b113-670039937534 req-7d25068f-e13f-409c-9144-af069e7bd558 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Detach interface failed, port_id=7f253c88-5c90-410c-bbe6-a152ae7c3a63, reason: Instance 7da96eef-5195-4fe9-8421-3b8b79420a86 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:17:20 np0005539505 podman[233840]: 2025-11-29 07:17:20.723488769 +0000 UTC m=+0.056533835 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:17:20 np0005539505 podman[233839]: 2025-11-29 07:17:20.727875084 +0000 UTC m=+0.060004854 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 29 02:17:20 np0005539505 systemd[1]: Starting dnf makecache...
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.863 186962 DEBUG nova.network.neutron [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:20 np0005539505 dnf[233881]: Metadata cache refreshed recently.
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.977 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:20 np0005539505 nova_compute[186958]: 2025-11-29 07:17:20.979 186962 INFO nova.compute.manager [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Took 9.60 seconds to deallocate network for instance.#033[00m
Nov 29 02:17:20 np0005539505 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 29 02:17:20 np0005539505 systemd[1]: Finished dnf makecache.
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.079 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.079 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.166 186962 DEBUG nova.compute.provider_tree [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.187 186962 DEBUG nova.scheduler.client.report [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.217 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.249 186962 INFO nova.scheduler.client.report [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Deleted allocations for instance 7da96eef-5195-4fe9-8421-3b8b79420a86#033[00m
Nov 29 02:17:21 np0005539505 nova_compute[186958]: 2025-11-29 07:17:21.335 186962 DEBUG oslo_concurrency.lockutils [None req-1cf2c7e8-f2c8-4716-a9d7-3e8cc0975ade 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "7da96eef-5195-4fe9-8421-3b8b79420a86" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:22 np0005539505 nova_compute[186958]: 2025-11-29 07:17:22.358 186962 DEBUG nova.compute.manager [req-d36497fa-6cd0-4d47-8f7a-0fd3e86205d1 req-cbb5870f-c3d9-472c-bfed-dacd1fcdac44 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Received event network-vif-deleted-816e158d-4c1c-4ea8-ae90-eb4e66048a31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:22 np0005539505 nova_compute[186958]: 2025-11-29 07:17:22.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:22 np0005539505 nova_compute[186958]: 2025-11-29 07:17:22.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:17:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:23.961 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:23 np0005539505 nova_compute[186958]: 2025-11-29 07:17:23.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:23.963 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:17:24 np0005539505 nova_compute[186958]: 2025-11-29 07:17:24.400 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:24 np0005539505 nova_compute[186958]: 2025-11-29 07:17:24.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:25 np0005539505 nova_compute[186958]: 2025-11-29 07:17:25.770 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400630.7697744, 7da96eef-5195-4fe9-8421-3b8b79420a86 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:25 np0005539505 nova_compute[186958]: 2025-11-29 07:17:25.771 186962 INFO nova.compute.manager [-] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:17:25 np0005539505 nova_compute[186958]: 2025-11-29 07:17:25.816 186962 DEBUG nova.compute.manager [None req-7ab5f541-68b4-4ae5-aa12-9136db9799e6 - - - - - -] [instance: 7da96eef-5195-4fe9-8421-3b8b79420a86] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:25 np0005539505 nova_compute[186958]: 2025-11-29 07:17:25.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:26.956 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:26.957 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:26.957 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:28.966 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:29 np0005539505 nova_compute[186958]: 2025-11-29 07:17:29.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:31 np0005539505 nova_compute[186958]: 2025-11-29 07:17:31.019 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:32 np0005539505 podman[233884]: 2025-11-29 07:17:32.743560185 +0000 UTC m=+0.059768267 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:17:32 np0005539505 podman[233883]: 2025-11-29 07:17:32.751361676 +0000 UTC m=+0.071533890 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter)
Nov 29 02:17:34 np0005539505 nova_compute[186958]: 2025-11-29 07:17:34.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:35 np0005539505 podman[233928]: 2025-11-29 07:17:35.709042785 +0000 UTC m=+0.043262909 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:17:36 np0005539505 nova_compute[186958]: 2025-11-29 07:17:36.022 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:36 np0005539505 nova_compute[186958]: 2025-11-29 07:17:36.946 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:36 np0005539505 nova_compute[186958]: 2025-11-29 07:17:36.947 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:37 np0005539505 nova_compute[186958]: 2025-11-29 07:17:37.400 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:17:37 np0005539505 nova_compute[186958]: 2025-11-29 07:17:37.795 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:37 np0005539505 nova_compute[186958]: 2025-11-29 07:17:37.795 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:37 np0005539505 nova_compute[186958]: 2025-11-29 07:17:37.806 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:17:37 np0005539505 nova_compute[186958]: 2025-11-29 07:17:37.807 186962 INFO nova.compute.claims [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:17:38 np0005539505 nova_compute[186958]: 2025-11-29 07:17:38.520 186962 DEBUG nova.compute.provider_tree [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:17:38 np0005539505 nova_compute[186958]: 2025-11-29 07:17:38.579 186962 DEBUG nova.scheduler.client.report [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:17:38 np0005539505 nova_compute[186958]: 2025-11-29 07:17:38.727 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:38 np0005539505 nova_compute[186958]: 2025-11-29 07:17:38.728 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:17:39 np0005539505 nova_compute[186958]: 2025-11-29 07:17:39.319 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:17:39 np0005539505 nova_compute[186958]: 2025-11-29 07:17:39.319 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:17:39 np0005539505 nova_compute[186958]: 2025-11-29 07:17:39.526 186962 INFO nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:17:39 np0005539505 nova_compute[186958]: 2025-11-29 07:17:39.747 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:17:39 np0005539505 nova_compute[186958]: 2025-11-29 07:17:39.900 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:40 np0005539505 nova_compute[186958]: 2025-11-29 07:17:40.949 186962 DEBUG nova.policy [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
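The policy line above shows a caller with roles `['reader', 'member']` being denied `network:attach_external_network`, which is expected for a non-admin. A toy role check illustrating why the credentials fail (real oslo.policy rules are far richer; requiring the `admin` role here is an assumption for illustration only):

```python
def authorize(required_role, credentials):
    """Toy check: pass only if the caller holds the required role."""
    return required_role in credentials.get("roles", [])

# Subset of the credentials dict from the log line above.
creds = {
    "is_admin": False,
    "roles": ["reader", "member"],
    "project_id": "16d7af1670ea460db3d0422f176b6f98",
}

print(authorize("admin", creds))  # → False: the check fails, as logged
```

Note the DEBUG severity: a failed policy check here is informational, not an error, since nova simply skips attaching external networks for this user.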
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.025 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.061 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.063 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.064 186962 INFO nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Creating image(s)#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.065 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.065 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.067 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.103 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.184 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.185 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.186 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.198 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.253 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.254 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.303 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.304 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.305 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.357 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.358 186962 DEBUG nova.virt.disk.api [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Checking if we can resize image /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.359 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.413 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.414 186962 DEBUG nova.virt.disk.api [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Cannot resize image /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
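The "Cannot resize image ... to a smaller size" line above is nova's `can_resize_image` declining to shrink the qcow2 overlay: the flavor size (1073741824 bytes, matching the `qemu-img create ... 1073741824` earlier) is not larger than the image's current virtual size. A sketch of that decision, assuming only the virtual-size comparison from the `qemu-img info --output=json` output matters (`can_resize` is an illustrative helper, not nova's API):

```python
import json

def can_resize(qemu_img_info_json, requested_bytes):
    """True only when the requested size strictly grows the image."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes > virtual_size

# Shape of the qemu-img info JSON for the overlay created above (assumed fields).
info = json.dumps({"virtual-size": 1073741824, "format": "qcow2"})

print(can_resize(info, 1073741824))  # equal size: no resize, as in the log
```

Shrinking is refused because a guest filesystem may occupy blocks beyond the smaller size; growth is the only safe online direction.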
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.415 186962 DEBUG nova.objects.instance [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'migration_context' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.430 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.430 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Ensure instance console log exists: /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.431 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.432 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:41 np0005539505 nova_compute[186958]: 2025-11-29 07:17:41.432 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:44 np0005539505 nova_compute[186958]: 2025-11-29 07:17:44.107 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Successfully created port: 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:17:44 np0005539505 podman[233963]: 2025-11-29 07:17:44.734599738 +0000 UTC m=+0.065485389 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:17:44 np0005539505 podman[233964]: 2025-11-29 07:17:44.803134253 +0000 UTC m=+0.134544729 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:17:44 np0005539505 nova_compute[186958]: 2025-11-29 07:17:44.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:46 np0005539505 nova_compute[186958]: 2025-11-29 07:17:46.027 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.102 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Successfully updated port: 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.123 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.123 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.123 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.409 186962 DEBUG nova.compute.manager [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.410 186962 DEBUG nova.compute.manager [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.411 186962 DEBUG oslo_concurrency.lockutils [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:17:47 np0005539505 nova_compute[186958]: 2025-11-29 07:17:47.596 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:17:48 np0005539505 nova_compute[186958]: 2025-11-29 07:17:48.945 186962 DEBUG nova.network.neutron [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.065 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.065 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Instance network_info: |[{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.066 186962 DEBUG oslo_concurrency.lockutils [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.066 186962 DEBUG nova.network.neutron [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.068 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Start _get_guest_xml network_info=[{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.073 186962 WARNING nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.084 186962 DEBUG nova.virt.libvirt.host [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.084 186962 DEBUG nova.virt.libvirt.host [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.088 186962 DEBUG nova.virt.libvirt.host [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.088 186962 DEBUG nova.virt.libvirt.host [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.089 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.090 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.090 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.090 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.091 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.091 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.091 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.091 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.091 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.092 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.092 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.092 186962 DEBUG nova.virt.hardware [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.095 186962 DEBUG nova.virt.libvirt.vif [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.096 186962 DEBUG nova.network.os_vif_util [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.096 186962 DEBUG nova.network.os_vif_util [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.098 186962 DEBUG nova.objects.instance [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_devices' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.201 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <uuid>d71f022d-ac2d-48cb-bc26-3a9097ba969e</uuid>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <name>instance-0000006b</name>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:name>tempest-tempest.common.compute-instance-1971408115</nova:name>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:17:49</nova:creationTime>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        <nova:port uuid="4c93f14a-a590-48c6-acc4-f7ec9a91f59f">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="serial">d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="uuid">d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:74:74:96"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <target dev="tap4c93f14a-a5"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log" append="off"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:17:49 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:17:49 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:17:49 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:17:49 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.203 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Preparing to wait for external event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.203 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.204 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.204 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.205 186962 DEBUG nova.virt.libvirt.vif [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:17:39Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.205 186962 DEBUG nova.network.os_vif_util [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.206 186962 DEBUG nova.network.os_vif_util [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.206 186962 DEBUG os_vif [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.207 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.207 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.208 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.213 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.213 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c93f14a-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.213 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c93f14a-a5, col_values=(('external_ids', {'iface-id': '4c93f14a-a590-48c6-acc4-f7ec9a91f59f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:74:96', 'vm-uuid': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.216 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:49 np0005539505 NetworkManager[55134]: <info>  [1764400669.2173] manager: (tap4c93f14a-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.217 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.223 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.224 186962 INFO os_vif [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5')#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.417 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.418 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.418 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:74:74:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.419 186962 INFO nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Using config drive#033[00m
Nov 29 02:17:49 np0005539505 nova_compute[186958]: 2025-11-29 07:17:49.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.225 186962 INFO nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Creating config drive at /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config#033[00m
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.231 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppsgirh2e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.362 186962 DEBUG oslo_concurrency.processutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppsgirh2e" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:17:50 np0005539505 kernel: tap4c93f14a-a5: entered promiscuous mode
Nov 29 02:17:50 np0005539505 NetworkManager[55134]: <info>  [1764400670.4313] manager: (tap4c93f14a-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/233)
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.480 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:50Z|00465|binding|INFO|Claiming lport 4c93f14a-a590-48c6-acc4-f7ec9a91f59f for this chassis.
Nov 29 02:17:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:50Z|00466|binding|INFO|4c93f14a-a590-48c6-acc4-f7ec9a91f59f: Claiming fa:16:3e:74:74:96 10.100.0.11
Nov 29 02:17:50 np0005539505 systemd-udevd[234034]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:17:50 np0005539505 systemd-machined[153285]: New machine qemu-55-instance-0000006b.
Nov 29 02:17:50 np0005539505 NetworkManager[55134]: <info>  [1764400670.5254] device (tap4c93f14a-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:17:50 np0005539505 NetworkManager[55134]: <info>  [1764400670.5266] device (tap4c93f14a-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:17:50 np0005539505 systemd[1]: Started Virtual Machine qemu-55-instance-0000006b.
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.534 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:50Z|00467|binding|INFO|Setting lport 4c93f14a-a590-48c6-acc4-f7ec9a91f59f ovn-installed in OVS
Nov 29 02:17:50 np0005539505 nova_compute[186958]: 2025-11-29 07:17:50.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:50Z|00468|binding|INFO|Setting lport 4c93f14a-a590-48c6-acc4-f7ec9a91f59f up in Southbound
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.793 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:74:96 10.100.0.11'], port_security=['fa:16:3e:74:74:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a81715ba-eace-471d-9f71-9964fcbf6d85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4c93f14a-a590-48c6-acc4-f7ec9a91f59f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.796 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.800 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.816 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ac182ec6-4317-42e2-8ec5-4dbaaf923d17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.817 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90812230-31 in ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.819 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90812230-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.819 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b158db0-b174-499d-bdf7-6ad264f97aa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.820 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[19a0f900-a7be-4fc4-a582-faf8a3e19d74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.834 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb4da78-dee9-4980-84dd-1c05f55c8ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.854 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec08079-85d5-4971-9b3b-d5340ea6d092]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.888 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[66ddd351-b26f-45da-a26a-8783d983a954]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.894 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6663c4-0c39-45ee-b846-0d3832d39f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 NetworkManager[55134]: <info>  [1764400670.8953] manager: (tap90812230-30): new Veth device (/org/freedesktop/NetworkManager/Devices/234)
Nov 29 02:17:50 np0005539505 systemd-udevd[234037]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:17:50 np0005539505 podman[234048]: 2025-11-29 07:17:50.927390156 +0000 UTC m=+0.062956908 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.932 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[83fbeb68-69eb-49c9-b827-9788e8598b3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 podman[234046]: 2025-11-29 07:17:50.934130227 +0000 UTC m=+0.068241037 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.936 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[74847490-ec7a-4833-bc63-47a4a55ee5a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 NetworkManager[55134]: <info>  [1764400670.9586] device (tap90812230-30): carrier: link connected
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.966 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[49e1979b-5814-4ee6-a02a-d74ada0ab17d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.982 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83e1fa55-131b-4b90-bcbb-d0eb1547fcb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611854, 'reachable_time': 21480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234103, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:50.997 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cad66ca6-e443-4345-bf38-bb7fcdfa70aa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:5f07'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611854, 'tstamp': 611854}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234104, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.014 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9358ed91-b920-4ba8-8948-f310eb7228ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611854, 'reachable_time': 21480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234105, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.050 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff28978f-ea4d-4f8d-a2d1-f1048cd60d5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.109 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cffe9649-ae1d-4dec-9b05-cb84922f82d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.110 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.110 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.111 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.112 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539505 kernel: tap90812230-30: entered promiscuous mode
Nov 29 02:17:51 np0005539505 NetworkManager[55134]: <info>  [1764400671.1133] manager: (tap90812230-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/235)
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.114 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.115 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.116 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:17:51Z|00469|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.117 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.118 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.118 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0a4fe9-6470-469c-b54e-092648fae545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.119 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/90812230-35cb-4e21-b16b-75b900100d8b.pid.haproxy
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:17:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:17:51.120 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'env', 'PROCESS_TAG=haproxy-90812230-35cb-4e21-b16b-75b900100d8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90812230-35cb-4e21-b16b-75b900100d8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.128 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.164 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400671.1639004, d71f022d-ac2d-48cb-bc26-3a9097ba969e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.165 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] VM Started (Lifecycle Event)#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.402 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.413 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400671.1641033, d71f022d-ac2d-48cb-bc26-3a9097ba969e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.414 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:17:51 np0005539505 podman[234144]: 2025-11-29 07:17:51.47840273 +0000 UTC m=+0.027852962 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:17:51 np0005539505 podman[234144]: 2025-11-29 07:17:51.707687385 +0000 UTC m=+0.257137597 container create 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.729 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.734 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:17:51 np0005539505 systemd[1]: Started libpod-conmon-12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f.scope.
Nov 29 02:17:51 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:17:51 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ceac94602c8b4e563d7c2f5fbb60361cd619db799d6e4b7b18d19b9f3d2ec593/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:17:51 np0005539505 podman[234144]: 2025-11-29 07:17:51.775006335 +0000 UTC m=+0.324456567 container init 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:17:51 np0005539505 podman[234144]: 2025-11-29 07:17:51.781093898 +0000 UTC m=+0.330544110 container start 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:17:51 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [NOTICE]   (234163) : New worker (234165) forked
Nov 29 02:17:51 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [NOTICE]   (234163) : Loading success.
Nov 29 02:17:51 np0005539505 nova_compute[186958]: 2025-11-29 07:17:51.811 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.325 186962 DEBUG nova.network.neutron [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.326 186962 DEBUG nova.network.neutron [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.344 186962 DEBUG nova.compute.manager [req-ef639cda-14a1-4f79-a55e-18678997d46b req-56765156-8f40-4d1e-8aa2-d07ec097c6ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.344 186962 DEBUG oslo_concurrency.lockutils [req-ef639cda-14a1-4f79-a55e-18678997d46b req-56765156-8f40-4d1e-8aa2-d07ec097c6ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.345 186962 DEBUG oslo_concurrency.lockutils [req-ef639cda-14a1-4f79-a55e-18678997d46b req-56765156-8f40-4d1e-8aa2-d07ec097c6ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.345 186962 DEBUG oslo_concurrency.lockutils [req-ef639cda-14a1-4f79-a55e-18678997d46b req-56765156-8f40-4d1e-8aa2-d07ec097c6ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.345 186962 DEBUG nova.compute.manager [req-ef639cda-14a1-4f79-a55e-18678997d46b req-56765156-8f40-4d1e-8aa2-d07ec097c6ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Processing event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.346 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.349 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400672.3495288, d71f022d-ac2d-48cb-bc26-3a9097ba969e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.350 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.351 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.354 186962 INFO nova.virt.libvirt.driver [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Instance spawned successfully.#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.354 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.565 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.566 186962 DEBUG oslo_concurrency.lockutils [req-786b9c33-0128-4b1e-8573-c1a3a0e6a403 req-929188b4-1175-4944-8a4c-198f55f724b5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.570 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.571 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.571 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.572 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.572 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.573 186962 DEBUG nova.virt.libvirt.driver [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.578 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:17:52 np0005539505 nova_compute[186958]: 2025-11-29 07:17:52.687 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:17:53 np0005539505 nova_compute[186958]: 2025-11-29 07:17:53.477 186962 INFO nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Took 12.42 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:17:53 np0005539505 nova_compute[186958]: 2025-11-29 07:17:53.478 186962 DEBUG nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:17:54 np0005539505 nova_compute[186958]: 2025-11-29 07:17:54.217 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:54 np0005539505 nova_compute[186958]: 2025-11-29 07:17:54.585 186962 INFO nova.compute.manager [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Took 16.84 seconds to build instance.#033[00m
Nov 29 02:17:54 np0005539505 nova_compute[186958]: 2025-11-29 07:17:54.774 186962 DEBUG oslo_concurrency.lockutils [None req-eace6464-1598-432e-9ade-16e96fea5c8e 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:54 np0005539505 nova_compute[186958]: 2025-11-29 07:17:54.906 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.466 186962 DEBUG nova.compute.manager [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.466 186962 DEBUG oslo_concurrency.lockutils [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.467 186962 DEBUG oslo_concurrency.lockutils [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.467 186962 DEBUG oslo_concurrency.lockutils [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.467 186962 DEBUG nova.compute.manager [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:17:55 np0005539505 nova_compute[186958]: 2025-11-29 07:17:55.467 186962 WARNING nova.compute.manager [req-9bced731-be25-4901-a4ce-0455460dd3d1 req-d1c2ac30-0f3d-4c75-8d74-41460933260d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f for instance with vm_state active and task_state None.#033[00m
Nov 29 02:17:59 np0005539505 nova_compute[186958]: 2025-11-29 07:17:59.221 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:17:59 np0005539505 nova_compute[186958]: 2025-11-29 07:17:59.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:17:59 np0005539505 nova_compute[186958]: 2025-11-29 07:17:59.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:00 np0005539505 NetworkManager[55134]: <info>  [1764400680.6058] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Nov 29 02:18:00 np0005539505 NetworkManager[55134]: <info>  [1764400680.6063] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Nov 29 02:18:00 np0005539505 nova_compute[186958]: 2025-11-29 07:18:00.613 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:00 np0005539505 nova_compute[186958]: 2025-11-29 07:18:00.753 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:18:00Z|00470|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:18:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:18:00Z|00471|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:18:00 np0005539505 nova_compute[186958]: 2025-11-29 07:18:00.779 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:01 np0005539505 nova_compute[186958]: 2025-11-29 07:18:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:03 np0005539505 nova_compute[186958]: 2025-11-29 07:18:03.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:03.522 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:18:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:03.524 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:18:03 np0005539505 podman[234192]: 2025-11-29 07:18:03.749132618 +0000 UTC m=+0.076428329 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:18:03 np0005539505 podman[234191]: 2025-11-29 07:18:03.756660431 +0000 UTC m=+0.086199045 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350)
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.224 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.285 186962 DEBUG nova.compute.manager [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.286 186962 DEBUG nova.compute.manager [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.286 186962 DEBUG oslo_concurrency.lockutils [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.286 186962 DEBUG oslo_concurrency.lockutils [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.286 186962 DEBUG nova.network.neutron [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:18:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:18:04Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:74:96 10.100.0.11
Nov 29 02:18:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:18:04Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:74:96 10.100.0.11
Nov 29 02:18:04 np0005539505 nova_compute[186958]: 2025-11-29 07:18:04.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:06 np0005539505 podman[234235]: 2025-11-29 07:18:06.750162918 +0000 UTC m=+0.076656886 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:18:07 np0005539505 nova_compute[186958]: 2025-11-29 07:18:07.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:07 np0005539505 nova_compute[186958]: 2025-11-29 07:18:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:08 np0005539505 nova_compute[186958]: 2025-11-29 07:18:08.406 186962 DEBUG nova.network.neutron [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:18:08 np0005539505 nova_compute[186958]: 2025-11-29 07:18:08.407 186962 DEBUG nova.network.neutron [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:09 np0005539505 nova_compute[186958]: 2025-11-29 07:18:09.228 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:09 np0005539505 nova_compute[186958]: 2025-11-29 07:18:09.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:09 np0005539505 nova_compute[186958]: 2025-11-29 07:18:09.982 186962 DEBUG oslo_concurrency.lockutils [req-0d9a79f2-f16c-4a1b-9407-8deded09f7ec req-e0f9c2fe-a4eb-401b-a839-0fe11c2b2c24 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:13 np0005539505 nova_compute[186958]: 2025-11-29 07:18:13.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:13 np0005539505 nova_compute[186958]: 2025-11-29 07:18:13.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:18:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:13.527 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.232 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.638 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.638 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.638 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.639 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:18:14 np0005539505 nova_compute[186958]: 2025-11-29 07:18:14.931 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:15 np0005539505 podman[234255]: 2025-11-29 07:18:15.731132494 +0000 UTC m=+0.054611490 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:18:15 np0005539505 podman[234256]: 2025-11-29 07:18:15.790702745 +0000 UTC m=+0.100966836 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.223 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.281 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.283 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.338 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.346 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.402 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.404 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.460 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.607 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.608 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5390MB free_disk=73.16722106933594GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.609 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:16 np0005539505 nova_compute[186958]: 2025-11-29 07:18:16.609 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.093 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance aa4795d1-71b1-415f-ac22-5bb11775bc84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance d71f022d-ac2d-48cb-bc26-3a9097ba969e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.251 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.288 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.357 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.358 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:19 np0005539505 nova_compute[186958]: 2025-11-29 07:18:19.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.354 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.354 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.355 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.355 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.973 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.974 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.974 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:18:20 np0005539505 nova_compute[186958]: 2025-11-29 07:18:20.975 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:21 np0005539505 podman[234332]: 2025-11-29 07:18:21.759124936 +0000 UTC m=+0.070813130 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:18:21 np0005539505 podman[234331]: 2025-11-29 07:18:21.773135884 +0000 UTC m=+0.081691369 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:18:24 np0005539505 nova_compute[186958]: 2025-11-29 07:18:24.240 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:24 np0005539505 nova_compute[186958]: 2025-11-29 07:18:24.936 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:26.958 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:18:26.960 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.248 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.419 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.909 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.909 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.910 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:18:29 np0005539505 nova_compute[186958]: 2025-11-29 07:18:29.937 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:34 np0005539505 nova_compute[186958]: 2025-11-29 07:18:34.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:34 np0005539505 podman[234373]: 2025-11-29 07:18:34.724275427 +0000 UTC m=+0.051093650 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:18:34 np0005539505 podman[234372]: 2025-11-29 07:18:34.728830757 +0000 UTC m=+0.061182607 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc.)
Nov 29 02:18:34 np0005539505 nova_compute[186958]: 2025-11-29 07:18:34.940 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:37 np0005539505 podman[234416]: 2025-11-29 07:18:37.731253555 +0000 UTC m=+0.064604924 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:18:39 np0005539505 nova_compute[186958]: 2025-11-29 07:18:39.256 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:39 np0005539505 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 29 02:18:39 np0005539505 nova_compute[186958]: 2025-11-29 07:18:39.942 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:44 np0005539505 nova_compute[186958]: 2025-11-29 07:18:44.262 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:44 np0005539505 nova_compute[186958]: 2025-11-29 07:18:44.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:46 np0005539505 podman[234437]: 2025-11-29 07:18:46.7110704 +0000 UTC m=+0.045452971 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:18:46 np0005539505 podman[234438]: 2025-11-29 07:18:46.743159581 +0000 UTC m=+0.073998851 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:18:47 np0005539505 nova_compute[186958]: 2025-11-29 07:18:47.401 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:47 np0005539505 nova_compute[186958]: 2025-11-29 07:18:47.401 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:47 np0005539505 nova_compute[186958]: 2025-11-29 07:18:47.402 186962 INFO nova.compute.manager [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Unshelving#033[00m
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.091 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000068', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '329bbbdd41424742b3045e77150a498e', 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'hostId': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.093 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'name': 'tempest-tempest.common.compute-instance-1971408115', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006b', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '16d7af1670ea460db3d0422f176b6f98', 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'hostId': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.105 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.106 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.119 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.119 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35700ba6-fe98-4716-8271-6980b5263d06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.094378', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5e83444-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': '038b6e9338ed646f21b6c44609bf3b2b12998afae03492bc9fb46872f50fecbc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 
'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.094378', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5e8431c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': 'd133c14410b360847e34aa0ddb95abfae0173a2c3d80cf3fcae92692caf12e21'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.094378', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5ea3c12-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': 'db404430d3cb0d39e874e313d6987dc043fa80f71c20c81f941a6bed073350a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.094378', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5ea49dc-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': '8eaf89129939941e09fd775325aefb7515425e411bf4f3d8682f645ae2ac6449'}]}, 'timestamp': '2025-11-29 07:18:48.120090', '_unique_id': '044d779841bb485f96941ff611f18f8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.121 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.126 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aa4795d1-71b1-415f-ac22-5bb11775bc84 / tapc373f1d7-16 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.126 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.129 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d71f022d-ac2d-48cb-bc26-3a9097ba969e / tap4c93f14a-a5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.129 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.incoming.bytes volume: 1352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a381dfad-dcac-43d0-b329-7835b2dde97e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.123736', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5eb6614-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '6fb710081e5aa5a4a6371b96a774dbcf2f05e50d737eb5dfe68523830475f0c3'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1352, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.123736', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5ebc4d8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '66d2f28d9d3515fde577cc1d5fcdd729c66e0925e8698e09e10cea710d9531cf'}]}, 'timestamp': '2025-11-29 07:18:48.129794', '_unique_id': '47ed07b95a884f9592f957938ef57162'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.152 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.153 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.177 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.177 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0870d20-ecd4-4503-b424-b6932e456fff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.131905', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5ef57c4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '9c39a978a701fbc2d355c489f395e58debb41554880b02048b170ed70eaca156'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 
'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.131905', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5ef6688-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '20860565dee4dd89acf4e02645f43c2bee07dfd4cb6f79d322a98b3c22cca0d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.131905', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f31cf6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '2ddf19fd3d3e787c644bf1de23ad125949ffe72dd93cf43cc27e36cb40fcf7fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.131905', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f329e4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '406d4515ddb782fe1975f0678b795333492d4d461c85d852632f649c4e3da3a7'}]}, 'timestamp': '2025-11-29 07:18:48.178273', '_unique_id': '44a546053f474f60805d7dfcca4e07fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.180 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.181 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.181 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.181 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3272d414-7042-45c7-a7dc-f2b311622021', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.180720', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f396ae-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': 'd227e1b63e6cef7ecf722349b8845f3b76d6110063fb68a4af74ccf53d560b48'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 
'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.180720', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f3a112-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': '352b6dff461d6456127909dbb581b7d7895082baa21a2c15349b140abcf5dfc9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.180720', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f3ac98-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': 'c7c4b062f48bf2a10a952ea9f37ea08669618a656db4b7aba5dd9c66e252df45'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.180720', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f3b5da-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': '737e89e7c14a083641289a917bcd79d588ed93efc910fb8e6b36e63aaf187d7a'}]}, 'timestamp': '2025-11-29 07:18:48.181807', '_unique_id': 'f06bebd5fb7d4b7f8a2d76392e475825'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.183 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.latency volume: 170051732 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.183 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.latency volume: 19650901 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.184 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.latency volume: 154824727 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.184 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.latency volume: 26758043 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3fc3ed3-d384-439c-8251-e1eedb24ec13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170051732, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.183631', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f40760-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '92a5457853fe326f0cfaca37d94166464133a493e43a289def12f9ef862fc42b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19650901, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': 
None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.183631', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f4125a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '86d382fc0d4aa0e238f1aca9c75dfd43ec3b6c5273b141bb0f608e9c142a6bb8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 154824727, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.183631', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f41d22-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': 'db46d00d427dc26cf724b2a3de94207a7cd1e21491d6cd2b5047efb76a4ca685'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26758043, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.183631', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f4265a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '4af4459f56ffd48c59031e8b3029ffcc37988eb1833f0637391685b93ea29572'}]}, 'timestamp': '2025-11-29 07:18:48.184684', '_unique_id': '0ac75661bc744cde99332e6c1d8ac334'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.200 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/cpu volume: 11950000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.214 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/cpu volume: 11310000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '433cafec-6d95-44b2-a5c9-bc9c226fbb9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11950000000, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'timestamp': '2025-11-29T07:18:48.186394', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a5f69e30-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.840972726, 'message_signature': '4df7ffbb93283e2f6c8e74babd33522e429b079ec926ac56a7795d10f00bc8c0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11310000000, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 
'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'timestamp': '2025-11-29T07:18:48.186394', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a5f8dff6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.855668853, 'message_signature': '1e9c24d53a8433a70485d21d66e1aaba421393f78f8d7e8a89bcbfaa9ef0711c'}]}, 'timestamp': '2025-11-29 07:18:48.215777', '_unique_id': '6830b48de39e48a6a64f8511a9abd639'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.218 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.bytes volume: 72986624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.218 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.218 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.bytes volume: 72957952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.219 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a36fec1c-9d87-4be5-8f0e-939cb127cdd7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72986624, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.218172', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f94f9a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '5fd7dc69d024216522cbbbd2feff35a3617b407ecc7bd0c666b52a9a4e960d0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 
'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.218172', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f9603e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '1f8812feed77cf683ce476a0d8052a94d122ed93e1530fecf29ff3e5c9f14c47'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72957952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.218172', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5f96cf0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': 'ea77562d1d60a359c5498fbb9330dcd41710962b077ac8ec6c93d7a9379a900a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.218172', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5f97d1c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': 'b711ec9e3d5439b86b08448889bdf627fab952328b640c1423d338cd3a578b08'}]}, 'timestamp': '2025-11-29 07:18:48.219721', '_unique_id': '0804fafb0c7c4947bce1e6da23bf8f74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.222 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.222 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>]
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.222 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.223 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff86268a-c27b-4d8e-b75a-7872cd1d8065', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.222796', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fa03d6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': 'edaaa79d50f31fc28fdc89c026b8fe02715bc364786f1ba031337afdfcd9860f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.222796', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fa1326-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '77b96a523e3d6f51b61009f721d9e76fcabe952f5042c00dff69245fdbe70900'}]}, 'timestamp': '2025-11-29 07:18:48.224004', '_unique_id': 'd0c6be9ac9e44570942a63e585016bec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.224 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.226 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.226 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '521771f7-616e-4c20-85b7-9ca50ae150e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.226320', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fa8dce-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '63f5266e71ea2ffe7b27bf678a9b39d7dadee59c5676cd4bd27e3006e4e68cca'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.226320', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fa9b8e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': 'd1594369b352a20952421f435726caa71bfe4e9e1c926c3a8c8feb0da0ab4aa0'}]}, 'timestamp': '2025-11-29 07:18:48.227061', '_unique_id': 'ed79cd03abf04970b5af2ea4609adec6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.229 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.229 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.229 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.230 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bba717c-f4cc-4776-937b-85ba761c3187', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.229298', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fb0100-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': 'b4b47c9296b7c4af03b4faf981289332055ee09ea2d0a8779055df8f6efb43bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.229298', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fb0f2e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '9cb44b7cb4c50a2b95fd261aa457b27a0e1889e40753f15b3b2b9fd900fab14d'}]}, 'timestamp': '2025-11-29 07:18:48.229993', '_unique_id': 'd0226fc05ae7497fb5e0c1a557c760e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.230 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.231 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.requests volume: 1057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.232 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.232 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.232 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972bf1ad-00ce-46ae-8436-c3c8c56443a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1057, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.231819', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fb61ae-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': 'cffd611e482e1fe192b73b341e82eb94e23117018c0d786b7622ee641d3047b9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 
'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.231819', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fb6bfe-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '0d4491cefc156659204ce77010601334f2e5cc98d899f87d89f34b10b9ffc194'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.231819', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fb7586-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '705c78d27cd0cb1cad2cd67e6701e31cda4b5b14229d3e691d08f0d158e91ff9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.231819', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fb7e82-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '34a9aa02b009436752a02a69bd3e4423415a6d4d6a61c76b2c3855481eff4bab'}]}, 'timestamp': '2025-11-29 07:18:48.232818', '_unique_id': '919daa24012b44de8bc7064385a29517'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.233 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.234 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.234 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>]
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.235 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.236 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5be560e-f5ed-4ca7-a094-ad0bfae68a9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.235107', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fbfc18-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': 'a56dda717a433a9411d12c788ad8b51dfcd78ccd79d209363c8fee6d3abe0621'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.235107', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fc0988-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '4051b1ca521e463e2beb3cbb3a597c0bada05a241fbc41fc04cf399d5fcddd29'}]}, 'timestamp': '2025-11-29 07:18:48.236387', '_unique_id': '7172d18f534844909809f9a27d94936f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>]
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.238 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.239 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.239 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2edec474-1594-4e02-96df-ee7feb6e1c67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.238664', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fc6d1a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': 'a77769e7e5d98aadade3b24908f57849e92ea485fe91d2717489c7e387f638ce'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 
'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.238664', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fc777e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.735223756, 'message_signature': '1b418d4b8d4bc6757c0614eeb0b45dfb778353efc7e43695e043a5b75c20818b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.238664', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fc81b0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': 'f841f9cb67647fc12e2de1cf3d3283f755c2ab469deabc4d584a1e78f3c5c9e0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.238664', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fc8da4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.747654408, 'message_signature': '71e3acaeb4f209b1856c9b08e8da3a120858122e1c133b8c52586f962a7846a7'}]}, 'timestamp': '2025-11-29 07:18:48.239768', '_unique_id': '1a031f572fd94aeba455ac8e573b292a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.242 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.242 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26b76c60-ae45-4416-bb58-f47fc607f88f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.242052', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fcf4ec-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '4c704bdc3290b0def871c700ab0fdfbee8bfbfa81c529eda1a4fe21cc8ff6790'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 
'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.242052', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fd0298-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': 'd9f7c4607f24f4e2ed09bea4267359e2266dfba06efc6880fc2a63dd82334800'}]}, 'timestamp': '2025-11-29 07:18:48.242802', '_unique_id': '97b6c1468b5a484e9a03ee011562464f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.243 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.245 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.latency volume: 2796689788 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.245 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.245 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.latency volume: 3559030820 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.246 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.247 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_requests' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afe22382-bd52-4ab9-853c-c111a6cdf684', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2796689788, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.244985', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fd6634-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '01846e5062851e7622e33d8f4dbe8d786d825311392e467603350085b43214c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': 
None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.244985', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fd7408-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '4848e0c932656cbd2098e49b5f7c1cc62d9f9f95129bde2c38f23e7f57e4efc0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3559030820, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.244985', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a5fd802e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '01a14f2637bc25a47e4da983488c711b730e42fb835e8ad6596734a4dffc587f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.244985', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a5fd8c0e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '5bddce20c2e6458c1ed3ae32bb6df20d948f7666376be8f1847e4ecbb38f48ae'}]}, 'timestamp': '2025-11-29 07:18:48.246330', '_unique_id': '92afdd08070f4f2baa5dfd2c38ed6102'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.248 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.249 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a2eaaf1-4c23-4b5b-a7c1-8411ac4bb17c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.248745', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fdfa72-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '1384384519811a731a6797890814644b8852adf361973165323276c1e87afafc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.248745', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fe09a4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '3ad019b3613a8087154f293df5f73ad661136ac75f5af6945378522210a310a0'}]}, 'timestamp': '2025-11-29 07:18:48.249540', '_unique_id': 'ae194bc260e24459b76033d84825a940'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.250 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.251 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.252 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c684f0af-3b68-455f-b628-977d23e4d40b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.251870', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5fe74c0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '1057a7fa2a47f4d5402b9e93d09dd55890e881ae2b57ba5a5cc4891f1dc98cfa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.251870', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5fe8474-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': 'dd4f888b9fa5174826417b154424ab3bc7b2e552b46c14e13d89618051bf7e3a'}]}, 'timestamp': '2025-11-29 07:18:48.252687', '_unique_id': 'fe4b1a1e70754cba84b72ffc110bbc1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.253 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.255 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.255 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/memory.usage volume: 42.66796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.255 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/memory.usage volume: 46.2109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0d468ac-3bde-4469-93f8-77edbd1f54b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.66796875, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'timestamp': '2025-11-29T07:18:48.255176', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a5fef56c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.840972726, 'message_signature': 'c53a652003416df9e7893fd9f9aecfc547a669fe4d8e5a2d2657e4c85bf2f970'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.2109375, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 
'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'timestamp': '2025-11-29T07:18:48.255176', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'a5ff03a4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.855668853, 'message_signature': '94c058c7c62fc1cb2f28fe50512c38be06f4890e7dba676c81e8e1b2ac59fc9a'}]}, 'timestamp': '2025-11-29 07:18:48.255923', '_unique_id': '89e4032e036246d6b5b7e53f9a18d687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.256 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.258 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.258 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c3b060f-269f-4c72-bd9f-aa8c3f466aa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.258237', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5ff6c7c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': '3ed96ce8290628b23d285b3d323d95d57d2fb72a87fe1baec2349d231f210001'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.258237', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5ff79c4-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': '903b7449154af45fbe58135059db0d336a3788d3a965d39980c1595c463f2eaf'}]}, 'timestamp': '2025-11-29 07:18:48.258957', '_unique_id': '8243eef87cb34ebe9c01a266203e40dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.259 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.261 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.261 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddc795ba-95a9-437c-9673-552ec4d10681', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:18:48.261055', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'a5ffda68-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.764606989, 'message_signature': 'b85b46a675f624307dc650f8236faf6e95b5b0737e50a40fb2c7d5c28a29810b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'instance-0000006b-d71f022d-ac2d-48cb-bc26-3a9097ba969e-tap4c93f14a-a5', 'timestamp': '2025-11-29T07:18:48.261055', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'tap4c93f14a-a5', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:74:74:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4c93f14a-a5'}, 'message_id': 'a5ffe74c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.768274493, 'message_signature': 'eee72023298411d1e77a281188c8fd80c36eb8ad78e36572dbfcdab2bb4d1884'}]}, 'timestamp': '2025-11-29 07:18:48.261758', '_unique_id': 'b0c2182a2b3c452889b2401dab437fd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.262 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.263 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.264 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.bytes volume: 29534720 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.264 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.264 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.bytes volume: 30304768 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.264 12 DEBUG ceilometer.compute.pollsters [-] d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '649de29b-0609-4060-806a-e0c8fe9d8581', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29534720, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:18:48.263994', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a6004e76-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': '68f1aa0cd8386253a63f9160a32e7ab3fa349b5138be221d175d09e8b107401a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:18:48.263994', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a600597a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.77274932, 'message_signature': 'ea0e8effd21cbb2042a7818782bf2e5d1147c08eca765920b4f16a59b751a924'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30304768, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-vda', 'timestamp': '2025-11-29T07:18:48.263994', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a6006186-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': '26396b375640ab2baef85d9b20daa32379d1747a49ac158385df9bce6e86a9f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_name': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_name': None, 'resource_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e-sda', 'timestamp': '2025-11-29T07:18:48.263994', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-1971408115', 'name': 'instance-0000006b', 'instance_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'instance_type': 'm1.nano', 'host': 'd7bef00371814730be92b9a6940af2946aa1a884ec3fd5bcc9f0d6e6', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a6006a00-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6175.794396795, 'message_signature': 'fc638265deb6ed6d05d7f69af270cc76c24155aac0e17e7efe06cf78bf39ac8a'}]}, 'timestamp': '2025-11-29 07:18:48.265065', '_unique_id': '550c171aa7334ac4b5c2ae0eb02e5d3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.266 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:18:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:18:48.267 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersNegativeTestJSON-server-1506153238>, <NovaLikeServer: tempest-tempest.common.compute-instance-1971408115>]
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.323 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'numa_topology' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.555 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:18:48 np0005539505 nova_compute[186958]: 2025-11-29 07:18:48.555 186962 INFO nova.compute.claims [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:18:49 np0005539505 nova_compute[186958]: 2025-11-29 07:18:49.263 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:49 np0005539505 nova_compute[186958]: 2025-11-29 07:18:49.946 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:50 np0005539505 nova_compute[186958]: 2025-11-29 07:18:50.337 186962 DEBUG nova.compute.provider_tree [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:18:50 np0005539505 nova_compute[186958]: 2025-11-29 07:18:50.795 186962 DEBUG nova.scheduler.client.report [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:18:50 np0005539505 nova_compute[186958]: 2025-11-29 07:18:50.989 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:18:51 np0005539505 nova_compute[186958]: 2025-11-29 07:18:51.547 186962 INFO nova.network.neutron [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:18:52 np0005539505 podman[234491]: 2025-11-29 07:18:52.727437353 +0000 UTC m=+0.064034838 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:18:52 np0005539505 podman[234492]: 2025-11-29 07:18:52.733553667 +0000 UTC m=+0.062028951 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm)
Nov 29 02:18:53 np0005539505 nova_compute[186958]: 2025-11-29 07:18:53.911 186962 INFO nova.compute.manager [None req-ee28d86b-94e8-444d-bb7b-6c5515266eac 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Pausing#033[00m
Nov 29 02:18:53 np0005539505 nova_compute[186958]: 2025-11-29 07:18:53.913 186962 DEBUG nova.objects.instance [None req-ee28d86b-94e8-444d-bb7b-6c5515266eac 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'flavor' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.048 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400734.0485148, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.049 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.077 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.083 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.176 186962 DEBUG nova.compute.manager [None req-ee28d86b-94e8-444d-bb7b-6c5515266eac 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.267 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.540 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Nov 29 02:18:54 np0005539505 nova_compute[186958]: 2025-11-29 07:18:54.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:55 np0005539505 nova_compute[186958]: 2025-11-29 07:18:55.644 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:55 np0005539505 nova_compute[186958]: 2025-11-29 07:18:55.645 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:18:55 np0005539505 nova_compute[186958]: 2025-11-29 07:18:55.645 186962 DEBUG nova.network.neutron [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:18:59 np0005539505 nova_compute[186958]: 2025-11-29 07:18:59.223 186962 DEBUG nova.compute.manager [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:18:59 np0005539505 nova_compute[186958]: 2025-11-29 07:18:59.224 186962 DEBUG nova.compute.manager [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing instance network info cache due to event network-changed-160dd2b8-54e7-490c-8d0e-b15f57edcc04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:18:59 np0005539505 nova_compute[186958]: 2025-11-29 07:18:59.224 186962 DEBUG oslo_concurrency.lockutils [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:18:59 np0005539505 nova_compute[186958]: 2025-11-29 07:18:59.271 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:18:59 np0005539505 nova_compute[186958]: 2025-11-29 07:18:59.950 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.503 186962 INFO nova.compute.manager [None req-be17a81c-ea06-47ff-9c96-0ed341cd14c7 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Unpausing#033[00m
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.504 186962 DEBUG nova.objects.instance [None req-be17a81c-ea06-47ff-9c96-0ed341cd14c7 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'flavor' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.911 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400740.9111543, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.911 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:19:00 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.914 186962 DEBUG nova.virt.libvirt.guest [None req-be17a81c-ea06-47ff-9c96-0ed341cd14c7 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:19:00 np0005539505 nova_compute[186958]: 2025-11-29 07:19:00.915 186962 DEBUG nova.compute.manager [None req-be17a81c-ea06-47ff-9c96-0ed341cd14c7 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:01 np0005539505 nova_compute[186958]: 2025-11-29 07:19:01.114 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:01 np0005539505 nova_compute[186958]: 2025-11-29 07:19:01.117 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:19:01 np0005539505 nova_compute[186958]: 2025-11-29 07:19:01.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:04 np0005539505 nova_compute[186958]: 2025-11-29 07:19:04.275 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:04 np0005539505 nova_compute[186958]: 2025-11-29 07:19:04.951 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.007 186962 DEBUG nova.network.neutron [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.173 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.175 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.175 186962 INFO nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating image(s)#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.176 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.176 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.177 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.177 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.179 186962 DEBUG oslo_concurrency.lockutils [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.180 186962 DEBUG nova.network.neutron [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Refreshing network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.247 186962 DEBUG nova.compute.manager [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.248 186962 DEBUG nova.compute.manager [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.248 186962 DEBUG oslo_concurrency.lockutils [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.249 186962 DEBUG oslo_concurrency.lockutils [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.249 186962 DEBUG nova.network.neutron [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.258 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "d537cc6a7df5615191df4c72ff29adbe892591b7" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:05 np0005539505 nova_compute[186958]: 2025-11-29 07:19:05.259 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "d537cc6a7df5615191df4c72ff29adbe892591b7" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:05 np0005539505 podman[234528]: 2025-11-29 07:19:05.732536847 +0000 UTC m=+0.049422923 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 29 02:19:05 np0005539505 podman[234529]: 2025-11-29 07:19:05.749096217 +0000 UTC m=+0.061167676 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.012 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.072 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.part --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.073 186962 DEBUG nova.virt.images [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] 49c6be0d-bc35-4118-ad98-05409ab0466a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.074 186962 DEBUG nova.privsep.utils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.074 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.part /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.445 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.part /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.converted" returned: 0 in 0.371s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.455 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.513 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.515 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "d537cc6a7df5615191df4c72ff29adbe892591b7" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.527 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:07.558 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:07.560 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.600 186962 DEBUG nova.network.neutron [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.602 186962 DEBUG nova.network.neutron [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.605 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.606 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "d537cc6a7df5615191df4c72ff29adbe892591b7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.606 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "d537cc6a7df5615191df4c72ff29adbe892591b7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.617 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.644 186962 DEBUG oslo_concurrency.lockutils [req-4f9227d5-e177-468c-b2f9-1dd35c53f1a1 req-7730da63-1074-44ec-8179-14fc71d196e6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.647 186962 DEBUG nova.network.neutron [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updated VIF entry in instance network info cache for port 160dd2b8-54e7-490c-8d0e-b15f57edcc04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.647 186962 DEBUG nova.network.neutron [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.672 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.672 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7,backing_fmt=raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.691 186962 DEBUG oslo_concurrency.lockutils [req-b3059143-55f5-4b87-9c93-a385e6bc6933 req-5f13a366-b61f-4b65-bbf8-fda11825e269 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-99126e58-be6b-4a8d-bd7e-82d08cc3b61b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.705 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7,backing_fmt=raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.706 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "d537cc6a7df5615191df4c72ff29adbe892591b7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.706 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.760 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.761 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'migration_context' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.777 186962 INFO nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Rebasing disk image.#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.777 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.832 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:07 np0005539505 nova_compute[186958]: 2025-11-29 07:19:07.833 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:08 np0005539505 podman[234603]: 2025-11-29 07:19:08.72806646 +0000 UTC m=+0.058136721 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.277 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.520 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk" returned: 0 in 1.687s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.521 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.521 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Ensure instance console log exists: /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.522 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.522 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.522 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.525 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start _get_guest_xml network_info=[{"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='0a592e5d0c13d3f75148fc77c023bd27',container_format='bare',created_at=2025-11-29T07:17:48Z,direct_url=<?>,disk_format='qcow2',id=49c6be0d-bc35-4118-ad98-05409ab0466a,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1305320144-shelved',owner='32e51e3a9a8f4a1ca6e022735ebf5f7b',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:18:03Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.529 186962 WARNING nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.540 186962 DEBUG nova.virt.libvirt.host [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.541 186962 DEBUG nova.virt.libvirt.host [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.544 186962 DEBUG nova.virt.libvirt.host [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.545 186962 DEBUG nova.virt.libvirt.host [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.546 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.546 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='0a592e5d0c13d3f75148fc77c023bd27',container_format='bare',created_at=2025-11-29T07:17:48Z,direct_url=<?>,disk_format='qcow2',id=49c6be0d-bc35-4118-ad98-05409ab0466a,min_disk=1,min_ram=0,name='tempest-ServerActionsTestOtherB-server-1305320144-shelved',owner='32e51e3a9a8f4a1ca6e022735ebf5f7b',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-11-29T07:18:03Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.547 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.547 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.547 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.548 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.548 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.548 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.548 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.548 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.549 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.549 186962 DEBUG nova.virt.hardware [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.549 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.576 186962 DEBUG nova.virt.libvirt.vif [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='49c6be0d-bc35-4118-ad98-05409ab0466a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member',shelved_at='2025-11-29T07:18:03.976116',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='49c6be0d-bc35-4118-ad98-05409ab0466a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:18:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.576 186962 DEBUG nova.network.os_vif_util [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.577 186962 DEBUG nova.network.os_vif_util [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.578 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'pci_devices' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.614 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <uuid>99126e58-be6b-4a8d-bd7e-82d08cc3b61b</uuid>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <name>instance-00000067</name>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerActionsTestOtherB-server-1305320144</nova:name>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:19:09</nova:creationTime>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:user uuid="ee2d4931cb504b13b92a2f52c95c05ce">tempest-ServerActionsTestOtherB-1538648925-project-member</nova:user>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:project uuid="32e51e3a9a8f4a1ca6e022735ebf5f7b">tempest-ServerActionsTestOtherB-1538648925</nova:project>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="49c6be0d-bc35-4118-ad98-05409ab0466a"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        <nova:port uuid="160dd2b8-54e7-490c-8d0e-b15f57edcc04">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="serial">99126e58-be6b-4a8d-bd7e-82d08cc3b61b</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="uuid">99126e58-be6b-4a8d-bd7e-82d08cc3b61b</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:55:12:21"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <target dev="tap160dd2b8-54"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/console.log" append="off"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:19:09 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:19:09 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:19:09 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:19:09 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.615 186962 DEBUG nova.compute.manager [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Preparing to wait for external event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.615 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.616 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.616 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.616 186962 DEBUG nova.virt.libvirt.vif [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='49c6be0d-bc35-4118-ad98-05409ab0466a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:16:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member',shelved_at='2025-11-29T07:18:03.976116',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='49c6be0d-bc35-4118-ad98-05409ab0466a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:18:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.617 186962 DEBUG nova.network.os_vif_util [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.617 186962 DEBUG nova.network.os_vif_util [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.618 186962 DEBUG os_vif [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.619 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.619 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.624 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap160dd2b8-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.625 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap160dd2b8-54, col_values=(('external_ids', {'iface-id': '160dd2b8-54e7-490c-8d0e-b15f57edcc04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:12:21', 'vm-uuid': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:09 np0005539505 NetworkManager[55134]: <info>  [1764400749.6279] manager: (tap160dd2b8-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.634 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.635 186962 INFO os_vif [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54')#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.878 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.878 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.878 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] No VIF found with MAC fa:16:3e:55:12:21, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.880 186962 INFO nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Using config drive#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.953 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:09 np0005539505 nova_compute[186958]: 2025-11-29 07:19:09.957 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.056 186962 DEBUG nova.objects.instance [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'keypairs' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.477 186962 INFO nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Creating config drive at /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config#033[00m
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.483 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8vif4n41 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.565 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.617 186962 DEBUG oslo_concurrency.processutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8vif4n41" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:10 np0005539505 kernel: tap160dd2b8-54: entered promiscuous mode
Nov 29 02:19:10 np0005539505 NetworkManager[55134]: <info>  [1764400750.6754] manager: (tap160dd2b8-54): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Nov 29 02:19:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:10Z|00472|binding|INFO|Claiming lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 for this chassis.
Nov 29 02:19:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:10Z|00473|binding|INFO|160dd2b8-54e7-490c-8d0e-b15f57edcc04: Claiming fa:16:3e:55:12:21 10.100.0.12
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.678 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.684 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:12:21 10.100.0.12'], port_security=['fa:16:3e:55:12:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '7', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=160dd2b8-54e7-490c-8d0e-b15f57edcc04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.686 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 bound to our chassis#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.688 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df7cfc35-3f76-45b2-b70c-e4525d38f410#033[00m
Nov 29 02:19:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:10Z|00474|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 ovn-installed in OVS
Nov 29 02:19:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:10Z|00475|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 up in Southbound
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:10 np0005539505 nova_compute[186958]: 2025-11-29 07:19:10.694 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.702 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[339acbf5-c046-432f-af3c-034b3602b281]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.703 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf7cfc35-31 in ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.706 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf7cfc35-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.706 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f998a0-d74e-4638-ad2a-d23fb19e756c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.707 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd8d697-1a66-4204-8571-97a8d339ef10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 systemd-udevd[234645]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:19:10 np0005539505 systemd-machined[153285]: New machine qemu-56-instance-00000067.
Nov 29 02:19:10 np0005539505 NetworkManager[55134]: <info>  [1764400750.7208] device (tap160dd2b8-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:19:10 np0005539505 NetworkManager[55134]: <info>  [1764400750.7216] device (tap160dd2b8-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.722 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4f34b8af-adc5-437f-8905-d6015072346a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 systemd[1]: Started Virtual Machine qemu-56-instance-00000067.
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.782 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[755ea47e-58c3-4efb-9563-0afb7bd6b4cf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.810 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5e7bf8-0042-4119-bced-43b954f804bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 systemd-udevd[234648]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.817 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[87e53a5b-1ccc-4f5c-a88a-b5d2f7b95aea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 NetworkManager[55134]: <info>  [1764400750.8184] manager: (tapdf7cfc35-30): new Veth device (/org/freedesktop/NetworkManager/Devices/240)
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.855 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5388caa6-c1df-4891-b1e6-494721d948d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.859 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[210fd93b-1ba6-4f0e-8642-a8d6c437520f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 NetworkManager[55134]: <info>  [1764400750.8822] device (tapdf7cfc35-30): carrier: link connected
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.891 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2b00815a-82a8-40e4-835d-7c722fe568e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.910 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bacca563-14c7-41e7-8901-cf5991c528da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619846, 'reachable_time': 36926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234677, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.925 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[569566d1-1ba0-414e-8270-1940a68f5f4a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:aeb6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 619846, 'tstamp': 619846}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234678, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.941 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1cf771e-cf89-48fd-af8c-ee8fbd3e50eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf7cfc35-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:ae:b6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 152], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619846, 'reachable_time': 36926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234679, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:10.970 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0e1ea7-eba3-4c44-bcfe-35922711d28c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.024 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4588d79f-4763-490f-9133-ead6b68bd356]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.026 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.026 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.026 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf7cfc35-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:11 np0005539505 kernel: tapdf7cfc35-30: entered promiscuous mode
Nov 29 02:19:11 np0005539505 NetworkManager[55134]: <info>  [1764400751.0563] manager: (tapdf7cfc35-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/241)
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.059 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf7cfc35-30, col_values=(('external_ids', {'iface-id': 'cab31803-36dd-4107-bb9e-3d36862142c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.060 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:11Z|00476|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.072 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.073 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f31ba5a8-7053-4686-8376-ad386dbb45dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.074 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/df7cfc35-3f76-45b2-b70c-e4525d38f410.pid.haproxy
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID df7cfc35-3f76-45b2-b70c-e4525d38f410
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:19:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:11.075 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'env', 'PROCESS_TAG=haproxy-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df7cfc35-3f76-45b2-b70c-e4525d38f410.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.267 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400751.2661977, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.267 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.299 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.305 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400751.2676792, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.306 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.426 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.429 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:19:11 np0005539505 nova_compute[186958]: 2025-11-29 07:19:11.485 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:19:11 np0005539505 podman[234717]: 2025-11-29 07:19:11.658727422 +0000 UTC m=+0.024710032 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:19:12 np0005539505 podman[234717]: 2025-11-29 07:19:12.393144919 +0000 UTC m=+0.759127499 container create 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 02:19:12 np0005539505 systemd[1]: Started libpod-conmon-2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3.scope.
Nov 29 02:19:12 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:19:12 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c06686be060407564adbb944780f18cb96e0dbd18bebd3e90bb012c944cc3e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.705 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.706 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.707 186962 DEBUG nova.objects.instance [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:12 np0005539505 podman[234717]: 2025-11-29 07:19:12.85207266 +0000 UTC m=+1.218055330 container init 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:19:12 np0005539505 podman[234717]: 2025-11-29 07:19:12.85912079 +0000 UTC m=+1.225103370 container start 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:19:12 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [NOTICE]   (234736) : New worker (234738) forked
Nov 29 02:19:12 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [NOTICE]   (234736) : Loading success.
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.980 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.981 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.982 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.982 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:12 np0005539505 nova_compute[186958]: 2025-11-29 07:19:12.983 186962 DEBUG nova.network.neutron [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:13 np0005539505 nova_compute[186958]: 2025-11-29 07:19:13.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:13 np0005539505 nova_compute[186958]: 2025-11-29 07:19:13.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:19:13 np0005539505 nova_compute[186958]: 2025-11-29 07:19:13.620 186962 DEBUG nova.objects.instance [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'pci_requests' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:13 np0005539505 nova_compute[186958]: 2025-11-29 07:19:13.637 186962 DEBUG nova.network.neutron [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.190 186962 DEBUG nova.policy [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a13f011f5b74a6f94a2d2c8e9104f4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '16d7af1670ea460db3d0422f176b6f98', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.415 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.416 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.416 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.416 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.498 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.610 186962 DEBUG nova.network.neutron [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.611 186962 DEBUG nova.network.neutron [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.616 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.617 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.633 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.674 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.679 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.698 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.699 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.699 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.699 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.699 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.700 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Processing event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.700 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.700 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.700 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.701 186962 DEBUG oslo_concurrency.lockutils [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.701 186962 DEBUG nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.701 186962 WARNING nova.compute.manager [req-3dd4fb2b-ec00-4bc6-8ea6-cbfd065764b1 req-1dd28867-1e98-4ada-87a8-5c96f29a7127 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received unexpected event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.702 186962 DEBUG nova.compute.manager [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.706 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400754.7057486, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.706 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.708 186962 DEBUG nova.virt.libvirt.driver [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.711 186962 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance spawned successfully.#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.739 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.740 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.759 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.766 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.791 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.797 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.802 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.855 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.856 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.907 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:14 np0005539505 nova_compute[186958]: 2025-11-29 07:19:14.956 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.097 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.099 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5339MB free_disk=73.07075881958008GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.099 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.099 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.434 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance aa4795d1-71b1-415f-ac22-5bb11775bc84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.434 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance d71f022d-ac2d-48cb-bc26-3a9097ba969e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.434 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.435 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.435 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:19:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:15Z|00477|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:19:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:15Z|00478|binding|INFO|Releasing lport cab31803-36dd-4107-bb9e-3d36862142c0 from this chassis (sb_readonly=0)
Nov 29 02:19:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:15Z|00479|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.512 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.534 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.535 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.552 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.563 186962 DEBUG nova.compute.manager [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.580 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.624 186962 DEBUG nova.network.neutron [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Successfully updated port: b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.650 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.651 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.651 186962 DEBUG nova.network.neutron [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.663 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.682 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.696 186962 DEBUG oslo_concurrency.lockutils [None req-a3c772ab-ea3e-46e9-9bd4-c7def8603e8b ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 28.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.718 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.719 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.740 186962 DEBUG nova.compute.manager [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.740 186962 DEBUG nova.compute.manager [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.740 186962 DEBUG oslo_concurrency.lockutils [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:15 np0005539505 nova_compute[186958]: 2025-11-29 07:19:15.968 186962 WARNING nova.network.neutron [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] 90812230-35cb-4e21-b16b-75b900100d8b already exists in list: networks containing: ['90812230-35cb-4e21-b16b-75b900100d8b']. ignoring it#033[00m
Nov 29 02:19:17 np0005539505 podman[234767]: 2025-11-29 07:19:17.72535109 +0000 UTC m=+0.055893477 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.742 186962 DEBUG nova.network.neutron [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:17 np0005539505 podman[234768]: 2025-11-29 07:19:17.819045729 +0000 UTC m=+0.138374578 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.833 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.834 186962 DEBUG oslo_concurrency.lockutils [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.835 186962 DEBUG nova.network.neutron [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.838 186962 DEBUG nova.virt.libvirt.vif [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.838 186962 DEBUG nova.network.os_vif_util [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.839 186962 DEBUG nova.network.os_vif_util [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.839 186962 DEBUG os_vif [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.840 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.840 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.843 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.843 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1f37250-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.844 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb1f37250-55, col_values=(('external_ids', {'iface-id': 'b1f37250-55e3-4fc4-a9bb-2dedac4d03f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:f7:1c', 'vm-uuid': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.845 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.846 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:19:17 np0005539505 NetworkManager[55134]: <info>  [1764400757.8462] manager: (tapb1f37250-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.854 186962 INFO os_vif [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55')#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.854 186962 DEBUG nova.virt.libvirt.vif [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.855 186962 DEBUG nova.network.os_vif_util [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.855 186962 DEBUG nova.network.os_vif_util [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.857 186962 DEBUG nova.virt.libvirt.guest [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <target dev="tapb1f37250-55"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:19:17 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:19:17 np0005539505 kernel: tapb1f37250-55: entered promiscuous mode
Nov 29 02:19:17 np0005539505 NetworkManager[55134]: <info>  [1764400757.8715] manager: (tapb1f37250-55): new Tun device (/org/freedesktop/NetworkManager/Devices/243)
Nov 29 02:19:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:17Z|00480|binding|INFO|Claiming lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for this chassis.
Nov 29 02:19:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:17Z|00481|binding|INFO|b1f37250-55e3-4fc4-a9bb-2dedac4d03f5: Claiming fa:16:3e:2f:f7:1c 10.100.0.7
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.876 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:17Z|00482|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 up in Southbound
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.886 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:f7:1c 10.100.0.7'], port_security=['fa:16:3e:2f:f7:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '2', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:17Z|00483|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 ovn-installed in OVS
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.888 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 in datapath 90812230-35cb-4e21-b16b-75b900100d8b bound to our chassis#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.887 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.891 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b#033[00m
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.906 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[32e90180-394f-405d-9b5d-4722c12237b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:17 np0005539505 systemd-udevd[234827]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:19:17 np0005539505 NetworkManager[55134]: <info>  [1764400757.9341] device (tapb1f37250-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:19:17 np0005539505 NetworkManager[55134]: <info>  [1764400757.9357] device (tapb1f37250-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.941 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ff458e0b-4155-4f8c-b406-30eab02769c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.944 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8752ea29-cf53-47b6-83a0-9a5852fdf964]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.975 186962 DEBUG nova.virt.libvirt.driver [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.975 186962 DEBUG nova.virt.libvirt.driver [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.975 186962 DEBUG nova.virt.libvirt.driver [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:74:74:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.975 186962 DEBUG nova.virt.libvirt.driver [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] No VIF found with MAC fa:16:3e:2f:f7:1c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.975 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e5d481-8b2a-4a67-8fce-474e2c146f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:17.992 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[20e7ae7a-5865-4551-97d9-f5b70ea47323]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611854, 'reachable_time': 21480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234833, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:17 np0005539505 nova_compute[186958]: 2025-11-29 07:19:17.997 186962 DEBUG nova.virt.libvirt.guest [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:name>tempest-tempest.common.compute-instance-1971408115</nova:name>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:19:17</nova:creationTime>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:port uuid="4c93f14a-a590-48c6-acc4-f7ec9a91f59f">
Nov 29 02:19:17 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:19:17 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:17 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:19:17 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:19:17 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.009 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27850c9d-eeea-489a-8db3-dcdffb016852]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611866, 'tstamp': 611866}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234834, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611868, 'tstamp': 611868}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234834, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.011 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.013 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.014 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.014 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.014 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.015 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:18.015 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.028 186962 DEBUG oslo_concurrency.lockutils [None req-82b97091-845e-4969-ade3-336395290ff3 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.482 186962 DEBUG nova.compute.manager [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.482 186962 DEBUG oslo_concurrency.lockutils [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.482 186962 DEBUG oslo_concurrency.lockutils [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.483 186962 DEBUG oslo_concurrency.lockutils [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.483 186962 DEBUG nova.compute.manager [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.483 186962 WARNING nova.compute.manager [req-abd3746e-f039-46c9-86b1-a60993500d47 req-d80cc8af-fcc0-428b-87bb-ef5969900d65 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.714 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.715 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.715 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.994 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:18 np0005539505 nova_compute[186958]: 2025-11-29 07:19:18.994 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.012 186962 DEBUG nova.objects.instance [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'flavor' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.044 186962 DEBUG nova.virt.libvirt.vif [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.044 186962 DEBUG nova.network.os_vif_util [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.045 186962 DEBUG nova.network.os_vif_util [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.047 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.049 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.051 186962 DEBUG nova.virt.libvirt.driver [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Attempting to detach device tapb1f37250-55 from instance d71f022d-ac2d-48cb-bc26-3a9097ba969e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.052 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <target dev="tapb1f37250-55"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.059 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.062 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface>not found in domain: <domain type='kvm' id='55'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <name>instance-0000006b</name>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <uuid>d71f022d-ac2d-48cb-bc26-3a9097ba969e</uuid>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:name>tempest-tempest.common.compute-instance-1971408115</nova:name>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:19:17</nova:creationTime>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:port uuid="4c93f14a-a590-48c6-acc4-f7ec9a91f59f">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='serial'>d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='uuid'>d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk' index='2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config' index='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:74:74:96'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='tap4c93f14a-a5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:2f:f7:1c'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='tapb1f37250-55'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='net1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log' append='off'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log' append='off'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c9,c59</label>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c9,c59</imagelabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.062 186962 INFO nova.virt.libvirt.driver [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb1f37250-55 from instance d71f022d-ac2d-48cb-bc26-3a9097ba969e from the persistent domain config.
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.062 186962 DEBUG nova.virt.libvirt.driver [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] (1/8): Attempting to detach device tapb1f37250-55 with device alias net1 from instance d71f022d-ac2d-48cb-bc26-3a9097ba969e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.062 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:2f:f7:1c"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <target dev="tapb1f37250-55"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:19:19 np0005539505 kernel: tapb1f37250-55 (unregistering): left promiscuous mode
Nov 29 02:19:19 np0005539505 NetworkManager[55134]: <info>  [1764400759.1171] device (tapb1f37250-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:19:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:19Z|00484|binding|INFO|Releasing lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 from this chassis (sb_readonly=0)
Nov 29 02:19:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:19Z|00485|binding|INFO|Setting lport b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 down in Southbound
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.122 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:19:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:19Z|00486|binding|INFO|Removing iface tapb1f37250-55 ovn-installed in OVS
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.124 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.131 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:f7:1c 10.100.0.7'], port_security=['fa:16:3e:2f:f7:1c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-636496357', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': '026dfe19-5964-4af9-9b69-58d89d9181a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.133 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.135 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90812230-35cb-4e21-b16b-75b900100d8b
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.138 186962 DEBUG nova.virt.libvirt.driver [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Start waiting for the detach event from libvirt for device tapb1f37250-55 with device alias net1 for instance d71f022d-ac2d-48cb-bc26-3a9097ba969e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.139 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Received event <DeviceRemovedEvent: 1764400759.1384695, d71f022d-ac2d-48cb-bc26-3a9097ba969e => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.140 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.143 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:2f:f7:1c"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb1f37250-55"/></interface>not found in domain: <domain type='kvm' id='55'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <name>instance-0000006b</name>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <uuid>d71f022d-ac2d-48cb-bc26-3a9097ba969e</uuid>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:name>tempest-tempest.common.compute-instance-1971408115</nova:name>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:19:17</nova:creationTime>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:port uuid="4c93f14a-a590-48c6-acc4-f7ec9a91f59f">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:port uuid="b1f37250-55e3-4fc4-a9bb-2dedac4d03f5">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='serial'>d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='uuid'>d71f022d-ac2d-48cb-bc26-3a9097ba969e</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk' index='2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk.config' index='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:74:74:96'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target dev='tap4c93f14a-a5'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log' append='off'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/console.log' append='off'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c9,c59</label>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c9,c59</imagelabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.144 186962 INFO nova.virt.libvirt.driver [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully detached device tapb1f37250-55 from instance d71f022d-ac2d-48cb-bc26-3a9097ba969e from the live domain config.#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.144 186962 DEBUG nova.virt.libvirt.vif [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.145 186962 DEBUG nova.network.os_vif_util [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.145 186962 DEBUG nova.network.os_vif_util [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.146 186962 DEBUG os_vif [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.148 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.148 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1f37250-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.149 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.149 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.151 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.150 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a5d3978-7000-4e70-8490-6c5d66c66d2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.153 186962 INFO os_vif [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:f7:1c,bridge_name='br-int',has_traffic_filtering=True,id=b1f37250-55e3-4fc4-a9bb-2dedac4d03f5,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb1f37250-55')#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.154 186962 DEBUG nova.virt.libvirt.guest [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:name>tempest-tempest.common.compute-instance-1971408115</nova:name>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:19:19</nova:creationTime>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:user uuid="9a13f011f5b74a6f94a2d2c8e9104f4a">tempest-AttachInterfacesTestJSON-844604773-project-member</nova:user>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:project uuid="16d7af1670ea460db3d0422f176b6f98">tempest-AttachInterfacesTestJSON-844604773</nova:project>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    <nova:port uuid="4c93f14a-a590-48c6-acc4-f7ec9a91f59f">
Nov 29 02:19:19 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:19:19 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:19:19 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.183 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[54e40fd3-8b14-41c2-8328-cd667a7eea1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.186 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5f54b292-2e65-4fb3-8437-c497afc82cd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.213 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9096afb6-242e-4584-935f-c4e068b3b415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.234 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[39c87429-d0f8-418c-a3ad-5467675825ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90812230-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:5f:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 150], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611854, 'reachable_time': 21480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234845, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.257 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9c96e42e-5a7e-4809-b62f-9951b6fd1f2d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611866, 'tstamp': 611866}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234846, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap90812230-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 611868, 'tstamp': 611868}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234846, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.273 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.276 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90812230-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90812230-30, col_values=(('external_ids', {'iface-id': '71b1ea47-55d6-453c-a181-e6370c4f7968'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:19.278 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.314 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.567 186962 DEBUG nova.network.neutron [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.568 186962 DEBUG nova.network.neutron [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.595 186962 DEBUG oslo_concurrency.lockutils [req-f226eb5d-b4db-44ea-b092-aed17d4bebc7 req-0e91a78a-6769-470a-8554-d59a74ddf9e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.596 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.597 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:19:19 np0005539505 nova_compute[186958]: 2025-11-29 07:19:19.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.563 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.563 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.563 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.564 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.564 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.564 186962 WARNING nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.564 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.564 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.565 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.565 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.565 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.565 186962 WARNING nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-unplugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.565 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.566 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.566 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.566 186962 DEBUG oslo_concurrency.lockutils [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.566 186962 DEBUG nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:20 np0005539505 nova_compute[186958]: 2025-11-29 07:19:20.566 186962 WARNING nova.compute.manager [req-76abf275-14ff-494f-b81a-e827bd7ff914 req-2440fd1d-ca60-4f0c-8e87-848cf38990e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-plugged-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:19:23 np0005539505 podman[234848]: 2025-11-29 07:19:23.736115464 +0000 UTC m=+0.065278614 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:19:23 np0005539505 podman[234849]: 2025-11-29 07:19:23.762263106 +0000 UTC m=+0.088067030 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 02:19:24 np0005539505 nova_compute[186958]: 2025-11-29 07:19:24.152 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:24 np0005539505 nova_compute[186958]: 2025-11-29 07:19:24.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.220 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.273 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.274 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.275 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.275 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.275 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.291 186962 INFO nova.compute.manager [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Terminating instance#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.305 186962 DEBUG nova.compute.manager [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:19:25 np0005539505 kernel: tap160dd2b8-54 (unregistering): left promiscuous mode
Nov 29 02:19:25 np0005539505 NetworkManager[55134]: <info>  [1764400765.3341] device (tap160dd2b8-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:19:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:25Z|00487|binding|INFO|Releasing lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 from this chassis (sb_readonly=0)
Nov 29 02:19:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:25Z|00488|binding|INFO|Setting lport 160dd2b8-54e7-490c-8d0e-b15f57edcc04 down in Southbound
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.343 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:25Z|00489|binding|INFO|Removing iface tap160dd2b8-54 ovn-installed in OVS
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.345 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000067.scope: Deactivated successfully.
Nov 29 02:19:25 np0005539505 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000067.scope: Consumed 11.443s CPU time.
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.401 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:12:21 10.100.0.12'], port_security=['fa:16:3e:55:12:21 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '99126e58-be6b-4a8d-bd7e-82d08cc3b61b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32e51e3a9a8f4a1ca6e022735ebf5f7b', 'neutron:revision_number': '9', 'neutron:security_group_ids': '490b426d-026a-4a21-8c41-f013fe0c1458', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.195', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b58113-8105-402c-a103-4692d3989228, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=160dd2b8-54e7-490c-8d0e-b15f57edcc04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.403 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 160dd2b8-54e7-490c-8d0e-b15f57edcc04 in datapath df7cfc35-3f76-45b2-b70c-e4525d38f410 unbound from our chassis#033[00m
Nov 29 02:19:25 np0005539505 systemd-machined[153285]: Machine qemu-56-instance-00000067 terminated.
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.404 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df7cfc35-3f76-45b2-b70c-e4525d38f410, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.406 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31ab6b61-2953-4d56-8455-e32cc9dfefcd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.406 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 namespace which is not needed anymore#033[00m
Nov 29 02:19:25 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [NOTICE]   (234736) : haproxy version is 2.8.14-c23fe91
Nov 29 02:19:25 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [NOTICE]   (234736) : path to executable is /usr/sbin/haproxy
Nov 29 02:19:25 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [WARNING]  (234736) : Exiting Master process...
Nov 29 02:19:25 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [ALERT]    (234736) : Current worker (234738) exited with code 143 (Terminated)
Nov 29 02:19:25 np0005539505 neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410[234732]: [WARNING]  (234736) : All workers exited. Exiting... (0)
Nov 29 02:19:25 np0005539505 systemd[1]: libpod-2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3.scope: Deactivated successfully.
Nov 29 02:19:25 np0005539505 conmon[234732]: conmon 2eb49ff895ca3886060e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3.scope/container/memory.events
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.521 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "address": "fa:16:3e:2f:f7:1c", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb1f37250-55", "ovs_interfaceid": "b1f37250-55e3-4fc4-a9bb-2dedac4d03f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:25 np0005539505 podman[234910]: 2025-11-29 07:19:25.524888707 +0000 UTC m=+0.044012580 container died 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:19:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:19:25 np0005539505 systemd[1]: var-lib-containers-storage-overlay-3c06686be060407564adbb944780f18cb96e0dbd18bebd3e90bb012c944cc3e4-merged.mount: Deactivated successfully.
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.562 186962 INFO nova.virt.libvirt.driver [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Instance destroyed successfully.#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.562 186962 DEBUG nova.objects.instance [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lazy-loading 'resources' on Instance uuid 99126e58-be6b-4a8d-bd7e-82d08cc3b61b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:25 np0005539505 podman[234910]: 2025-11-29 07:19:25.571586312 +0000 UTC m=+0.090710185 container cleanup 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:19:25 np0005539505 systemd[1]: libpod-conmon-2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3.scope: Deactivated successfully.
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.613 186962 DEBUG nova.virt.libvirt.vif [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1305320144',display_name='tempest-ServerActionsTestOtherB-server-1305320144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1305320144',id=103,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJEYU7KNgNpvYMhWLcgNKb4JeWm+l16ttLKZ2We4gLp8YMbZFLJD2i4RZSQXciBvCLn4uXa9U2Zxsdygka87gys3pZZ16d1VbC25mryAsCgbm8dp7GriXd9FfJytMY+M+Q==',key_name='tempest-keypair-1534024740',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:19:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='32e51e3a9a8f4a1ca6e022735ebf5f7b',ramdisk_id='',reservation_id='r-18bazq8j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1538648925',owner_user_name='tempest-ServerActionsTestOtherB-1538648925-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:19:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ee2d4931cb504b13b92a2f52c95c05ce',uuid=99126e58-be6b-4a8d-bd7e-82d08cc3b61b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.613 186962 DEBUG nova.network.os_vif_util [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converting VIF {"id": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "address": "fa:16:3e:55:12:21", "network": {"id": "df7cfc35-3f76-45b2-b70c-e4525d38f410", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1072835336-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32e51e3a9a8f4a1ca6e022735ebf5f7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap160dd2b8-54", "ovs_interfaceid": "160dd2b8-54e7-490c-8d0e-b15f57edcc04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.614 186962 DEBUG nova.network.os_vif_util [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.614 186962 DEBUG os_vif [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.616 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.616 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap160dd2b8-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.618 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.618 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.620 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.620 186962 DEBUG nova.network.neutron [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.624 186962 INFO os_vif [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:12:21,bridge_name='br-int',has_traffic_filtering=True,id=160dd2b8-54e7-490c-8d0e-b15f57edcc04,network=Network(df7cfc35-3f76-45b2-b70c-e4525d38f410),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap160dd2b8-54')#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.624 186962 INFO nova.virt.libvirt.driver [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Deleting instance files /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b_del#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.630 186962 INFO nova.virt.libvirt.driver [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Deletion of /var/lib/nova/instances/99126e58-be6b-4a8d-bd7e-82d08cc3b61b_del complete#033[00m
Nov 29 02:19:25 np0005539505 podman[234956]: 2025-11-29 07:19:25.634801106 +0000 UTC m=+0.043961079 container remove 2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.639 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[538c0dba-c66d-4b21-a9d0-640a7fce33ac]: (4, ('Sat Nov 29 07:19:25 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3)\n2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3\nSat Nov 29 07:19:25 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 (2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3)\n2eb49ff895ca3886060eef4c8a779e863e5371049ebbde6f09f34b0169dd53c3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.641 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6a4d32-ff61-4e53-bb20-c9d06c3967a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.642 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf7cfc35-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.643 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 kernel: tapdf7cfc35-30: left promiscuous mode
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.645 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[38ea64ee-ee83-4a54-89ff-5295a52c4be6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.655 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.670 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[25a232ce-6541-4c08-bf6d-aef297d9368b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.672 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3bebc6e3-b174-4d5c-854e-533b2c7080de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.688 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5f839bb4-c9e1-4b08-b499-9e1ea5f596a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 619839, 'reachable_time': 33452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234971, 'error': None, 'target': 'ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.691 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df7cfc35-3f76-45b2-b70c-e4525d38f410 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:19:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:25.691 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e315de27-d5a2-4683-8866-55cfd2d7cd6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:25 np0005539505 systemd[1]: run-netns-ovnmeta\x2ddf7cfc35\x2d3f76\x2d45b2\x2db70c\x2de4525d38f410.mount: Deactivated successfully.
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.738 186962 INFO nova.compute.manager [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.738 186962 DEBUG oslo.service.loopingcall [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.738 186962 DEBUG nova.compute.manager [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:19:25 np0005539505 nova_compute[186958]: 2025-11-29 07:19:25.739 186962 DEBUG nova.network.neutron [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.079 186962 DEBUG nova.compute.manager [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.079 186962 DEBUG oslo_concurrency.lockutils [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.079 186962 DEBUG oslo_concurrency.lockutils [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.080 186962 DEBUG oslo_concurrency.lockutils [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.080 186962 DEBUG nova.compute.manager [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.080 186962 DEBUG nova.compute.manager [req-5f6181a9-60a3-48a4-bbb6-d2ca4b5592f8 req-66a0eb22-9ffb-408a-9505-863361f1f10c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-unplugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.762 186962 DEBUG nova.compute.manager [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.762 186962 DEBUG nova.compute.manager [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing instance network info cache due to event network-changed-4c93f14a-a590-48c6-acc4-f7ec9a91f59f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:19:26 np0005539505 nova_compute[186958]: 2025-11-29 07:19:26.762 186962 DEBUG oslo_concurrency.lockutils [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:19:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:26.958 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:26.960 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.026 186962 DEBUG nova.network.neutron [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.094 186962 INFO nova.network.neutron [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Port b1f37250-55e3-4fc4-a9bb-2dedac4d03f5 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.094 186962 DEBUG nova.network.neutron [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.145 186962 INFO nova.compute.manager [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Took 1.41 seconds to deallocate network for instance.#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.161 186962 DEBUG nova.compute.manager [req-33f3027c-8019-48e5-96fe-231cef40c0ee req-688828a6-2245-4146-a2c1-c093a4ea2282 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-deleted-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.161 186962 INFO nova.compute.manager [req-33f3027c-8019-48e5-96fe-231cef40c0ee req-688828a6-2245-4146-a2c1-c093a4ea2282 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Neutron deleted interface 160dd2b8-54e7-490c-8d0e-b15f57edcc04; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.161 186962 DEBUG nova.network.neutron [req-33f3027c-8019-48e5-96fe-231cef40c0ee req-688828a6-2245-4146-a2c1-c093a4ea2282 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.199 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.203 186962 DEBUG oslo_concurrency.lockutils [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.204 186962 DEBUG nova.network.neutron [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Refreshing network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.207 186962 DEBUG nova.compute.manager [req-33f3027c-8019-48e5-96fe-231cef40c0ee req-688828a6-2245-4146-a2c1-c093a4ea2282 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Detach interface failed, port_id=160dd2b8-54e7-490c-8d0e-b15f57edcc04, reason: Instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.253 186962 DEBUG oslo_concurrency.lockutils [None req-c2cdd25a-3433-416e-8f85-439ca42a8fac 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "interface-d71f022d-ac2d-48cb-bc26-3a9097ba969e-b1f37250-55e3-4fc4-a9bb-2dedac4d03f5" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 8.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.293 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.293 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.487 186962 DEBUG nova.compute.provider_tree [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.514 186962 DEBUG nova.scheduler.client.report [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.565 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:27 np0005539505 nova_compute[186958]: 2025-11-29 07:19:27.652 186962 INFO nova.scheduler.client.report [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Deleted allocations for instance 99126e58-be6b-4a8d-bd7e-82d08cc3b61b#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.950 186962 DEBUG nova.compute.manager [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.951 186962 DEBUG oslo_concurrency.lockutils [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.951 186962 DEBUG oslo_concurrency.lockutils [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.952 186962 DEBUG oslo_concurrency.lockutils [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.952 186962 DEBUG nova.compute.manager [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] No waiting events found dispatching network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:28 np0005539505 nova_compute[186958]: 2025-11-29 07:19:28.953 186962 WARNING nova.compute.manager [req-fad560c6-c1af-4710-a841-3aefadf1f117 req-49a07396-1bc8-4dc8-a3b2-cb83ea7ecfc5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Received unexpected event network-vif-plugged-160dd2b8-54e7-490c-8d0e-b15f57edcc04 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:19:29 np0005539505 nova_compute[186958]: 2025-11-29 07:19:29.193 186962 DEBUG nova.network.neutron [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updated VIF entry in instance network info cache for port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:19:29 np0005539505 nova_compute[186958]: 2025-11-29 07:19:29.194 186962 DEBUG nova.network.neutron [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [{"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:19:30 np0005539505 nova_compute[186958]: 2025-11-29 07:19:30.015 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:30 np0005539505 nova_compute[186958]: 2025-11-29 07:19:30.179 186962 DEBUG oslo_concurrency.lockutils [req-b12ae91e-edd2-4029-b470-ef918bd15411 req-0ff0f3ed-a708-490f-a9ea-29024a07cff9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-d71f022d-ac2d-48cb-bc26-3a9097ba969e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:19:30 np0005539505 nova_compute[186958]: 2025-11-29 07:19:30.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:30 np0005539505 nova_compute[186958]: 2025-11-29 07:19:30.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:31 np0005539505 nova_compute[186958]: 2025-11-29 07:19:31.418 186962 DEBUG oslo_concurrency.lockutils [None req-edcf0812-3072-4306-af18-9aba5a161062 ee2d4931cb504b13b92a2f52c95c05ce 32e51e3a9a8f4a1ca6e022735ebf5f7b - - default default] Lock "99126e58-be6b-4a8d-bd7e-82d08cc3b61b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.032 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.620 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.720 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.721 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.721 186962 INFO nova.compute.manager [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Shelving#033[00m
Nov 29 02:19:35 np0005539505 nova_compute[186958]: 2025-11-29 07:19:35.842 186962 DEBUG nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:19:36 np0005539505 podman[234977]: 2025-11-29 07:19:36.728727335 +0000 UTC m=+0.056094642 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:19:36 np0005539505 podman[234978]: 2025-11-29 07:19:36.742172077 +0000 UTC m=+0.062959768 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:19:38 np0005539505 kernel: tapc373f1d7-16 (unregistering): left promiscuous mode
Nov 29 02:19:38 np0005539505 NetworkManager[55134]: <info>  [1764400778.3586] device (tapc373f1d7-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:19:38 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:38Z|00490|binding|INFO|Releasing lport c373f1d7-168e-494b-8e6f-c8af44b0db68 from this chassis (sb_readonly=0)
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:38 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:38Z|00491|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 down in Southbound
Nov 29 02:19:38 np0005539505 ovn_controller[95143]: 2025-11-29T07:19:38Z|00492|binding|INFO|Removing iface tapc373f1d7-16 ovn-installed in OVS
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.369 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:38.387 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:19:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:38.388 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 unbound from our chassis#033[00m
Nov 29 02:19:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:38.390 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14d61e69-b152-4adc-a95c-58748969e299, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:19:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:38.391 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fc497f87-a28f-47ed-9650-3f3a65d08c1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:38.391 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace which is not needed anymore#033[00m
Nov 29 02:19:38 np0005539505 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000068.scope: Deactivated successfully.
Nov 29 02:19:38 np0005539505 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000068.scope: Consumed 18.070s CPU time.
Nov 29 02:19:38 np0005539505 systemd-machined[153285]: Machine qemu-54-instance-00000068 terminated.
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.803 186962 DEBUG nova.compute.manager [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.804 186962 DEBUG oslo_concurrency.lockutils [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.804 186962 DEBUG oslo_concurrency.lockutils [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.804 186962 DEBUG oslo_concurrency.lockutils [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.805 186962 DEBUG nova.compute.manager [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.805 186962 WARNING nova.compute.manager [req-0bd506bf-7268-4402-9fa6-a92838973178 req-e3781387-bee8-429c-ac7f-535acf3fe9ec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state shelving.#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.858 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.866 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance destroyed successfully.#033[00m
Nov 29 02:19:38 np0005539505 nova_compute[186958]: 2025-11-29 07:19:38.867 186962 DEBUG nova.objects.instance [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'numa_topology' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:19:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [NOTICE]   (233539) : haproxy version is 2.8.14-c23fe91
Nov 29 02:19:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [NOTICE]   (233539) : path to executable is /usr/sbin/haproxy
Nov 29 02:19:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [WARNING]  (233539) : Exiting Master process...
Nov 29 02:19:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [ALERT]    (233539) : Current worker (233541) exited with code 143 (Terminated)
Nov 29 02:19:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[233535]: [WARNING]  (233539) : All workers exited. Exiting... (0)
Nov 29 02:19:39 np0005539505 systemd[1]: libpod-84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9.scope: Deactivated successfully.
Nov 29 02:19:39 np0005539505 podman[235045]: 2025-11-29 07:19:39.071801775 +0000 UTC m=+0.599509831 container died 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:19:39 np0005539505 nova_compute[186958]: 2025-11-29 07:19:39.297 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Beginning cold snapshot process#033[00m
Nov 29 02:19:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9-userdata-shm.mount: Deactivated successfully.
Nov 29 02:19:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay-1dba7ac76e171b26d7c4d0c26c5cc2b44a89bc51ef048df2852b0927c97a510d-merged.mount: Deactivated successfully.
Nov 29 02:19:39 np0005539505 podman[235075]: 2025-11-29 07:19:39.786292017 +0000 UTC m=+0.691012217 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.035 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:40 np0005539505 podman[235045]: 2025-11-29 07:19:40.196671681 +0000 UTC m=+1.724379737 container cleanup 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:19:40 np0005539505 systemd[1]: libpod-conmon-84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9.scope: Deactivated successfully.
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.560 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400765.559311, 99126e58-be6b-4a8d-bd7e-82d08cc3b61b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.561 186962 INFO nova.compute.manager [-] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:19:40 np0005539505 podman[235110]: 2025-11-29 07:19:40.618467928 +0000 UTC m=+0.388663818 container remove 84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.625 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[852f9cda-c920-4bd0-8d99-f1b7a5091eb0]: (4, ('Sat Nov 29 07:19:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9)\n84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9\nSat Nov 29 07:19:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9)\n84b7a98f0ca01918d00cca385d8eecc1530c228645f65cd828f4440cd71b12b9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.627 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f86424-0130-4c9b-b2a5-45c639afc3c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.628 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.630 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:40 np0005539505 kernel: tap14d61e69-b0: left promiscuous mode
Nov 29 02:19:40 np0005539505 nova_compute[186958]: 2025-11-29 07:19:40.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.647 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[462702ff-2580-454f-a037-63ae586e76f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.704 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[93a8c862-aadb-45df-aba0-c9f8c7b50b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.706 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[660bc67d-56da-4f6c-9bcd-d484a35aa469]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.723 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b79406ac-10ab-4373-a0be-e911e93c8a3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 607101, 'reachable_time': 16574, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235129, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.725 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:19:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:19:40.725 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[7a62f97d-f4b7-4097-b72f-3a77a6dabb67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:19:40 np0005539505 systemd[1]: run-netns-ovnmeta\x2d14d61e69\x2db152\x2d4adc\x2da95c\x2d58748969e299.mount: Deactivated successfully.
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.706 186962 DEBUG nova.compute.manager [None req-def86b2e-7678-49e8-8fd6-3870e6f1257b - - - - - -] [instance: 99126e58-be6b-4a8d-bd7e-82d08cc3b61b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.722 186962 DEBUG nova.compute.manager [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.723 186962 DEBUG oslo_concurrency.lockutils [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.758 186962 DEBUG oslo_concurrency.lockutils [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.758 186962 DEBUG oslo_concurrency.lockutils [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.759 186962 DEBUG nova.compute.manager [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.759 186962 WARNING nova.compute.manager [req-e6dbaad1-5f7d-447e-be11-482a3ca76eac req-55c39f8c-998d-440e-9d82-86736844e3ba 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.947 186962 DEBUG nova.privsep.utils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:19:41 np0005539505 nova_compute[186958]: 2025-11-29 07:19:41.948 186962 DEBUG oslo_concurrency.processutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk /var/lib/nova/instances/snapshots/tmps7r13_nk/721142605c69463dab31f444571984d4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:19:43 np0005539505 nova_compute[186958]: 2025-11-29 07:19:43.791 186962 DEBUG oslo_concurrency.processutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk /var/lib/nova/instances/snapshots/tmps7r13_nk/721142605c69463dab31f444571984d4" returned: 0 in 1.843s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:19:43 np0005539505 nova_compute[186958]: 2025-11-29 07:19:43.792 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:19:45 np0005539505 nova_compute[186958]: 2025-11-29 07:19:45.037 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:45 np0005539505 nova_compute[186958]: 2025-11-29 07:19:45.626 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:48 np0005539505 podman[235141]: 2025-11-29 07:19:48.718152061 +0000 UTC m=+0.051916854 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:19:48 np0005539505 podman[235142]: 2025-11-29 07:19:48.746610408 +0000 UTC m=+0.077756657 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 29 02:19:50 np0005539505 nova_compute[186958]: 2025-11-29 07:19:50.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:50 np0005539505 nova_compute[186958]: 2025-11-29 07:19:50.629 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:53 np0005539505 nova_compute[186958]: 2025-11-29 07:19:53.626 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400778.6249523, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:19:53 np0005539505 nova_compute[186958]: 2025-11-29 07:19:53.627 186962 INFO nova.compute.manager [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:19:54 np0005539505 podman[235194]: 2025-11-29 07:19:54.748078478 +0000 UTC m=+0.064180092 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:19:54 np0005539505 podman[235193]: 2025-11-29 07:19:54.759208544 +0000 UTC m=+0.087030290 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 29 02:19:55 np0005539505 nova_compute[186958]: 2025-11-29 07:19:55.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:55 np0005539505 nova_compute[186958]: 2025-11-29 07:19:55.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:19:58 np0005539505 nova_compute[186958]: 2025-11-29 07:19:58.868 186962 DEBUG nova.compute.manager [None req-d11e7fc8-8805-4fc5-82b5-142457ea07da - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:19:58 np0005539505 nova_compute[186958]: 2025-11-29 07:19:58.873 186962 DEBUG nova.compute.manager [None req-d11e7fc8-8805-4fc5-82b5-142457ea07da - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving_image_uploading, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:19:59 np0005539505 nova_compute[186958]: 2025-11-29 07:19:59.016 186962 INFO nova.compute.manager [None req-d11e7fc8-8805-4fc5-82b5-142457ea07da - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (shelving_image_uploading). Skip.#033[00m
Nov 29 02:20:00 np0005539505 nova_compute[186958]: 2025-11-29 07:20:00.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:00 np0005539505 nova_compute[186958]: 2025-11-29 07:20:00.633 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:01 np0005539505 nova_compute[186958]: 2025-11-29 07:20:01.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:01 np0005539505 nova_compute[186958]: 2025-11-29 07:20:01.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:03 np0005539505 nova_compute[186958]: 2025-11-29 07:20:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.045 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.187 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Snapshot image upload complete#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.187 186962 DEBUG nova.compute.manager [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.302 186962 INFO nova.compute.manager [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Shelve offloading#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.323 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance destroyed successfully.#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.324 186962 DEBUG nova.compute.manager [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.326 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.326 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.326 186962 DEBUG nova.network.neutron [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:05 np0005539505 nova_compute[186958]: 2025-11-29 07:20:05.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539505 nova_compute[186958]: 2025-11-29 07:20:07.099 186962 DEBUG nova.network.neutron [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:07 np0005539505 nova_compute[186958]: 2025-11-29 07:20:07.124 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:07Z|00493|binding|INFO|Releasing lport 71b1ea47-55d6-453c-a181-e6370c4f7968 from this chassis (sb_readonly=0)
Nov 29 02:20:07 np0005539505 nova_compute[186958]: 2025-11-29 07:20:07.188 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:07 np0005539505 podman[235235]: 2025-11-29 07:20:07.717935324 +0000 UTC m=+0.048839017 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:20:07 np0005539505 podman[235234]: 2025-11-29 07:20:07.718110759 +0000 UTC m=+0.051912604 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.283 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance destroyed successfully.#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.284 186962 DEBUG nova.objects.instance [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'resources' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.296 186962 DEBUG nova.virt.libvirt.vif [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member',shelved_at='2025-11-29T07:20:05.187714',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='36058703-a33f-4375-a956-154dfec4971e'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:19:44Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.297 186962 DEBUG nova.network.os_vif_util [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.298 186962 DEBUG nova.network.os_vif_util [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.298 186962 DEBUG os_vif [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.300 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.300 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc373f1d7-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.303 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.306 186962 INFO os_vif [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16')#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.307 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Deleting instance files /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84_del#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.313 186962 INFO nova.virt.libvirt.driver [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Deletion of /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84_del complete#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.420 186962 INFO nova.scheduler.client.report [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Deleted allocations for instance aa4795d1-71b1-415f-ac22-5bb11775bc84#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.456 186962 DEBUG nova.compute.manager [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.457 186962 DEBUG nova.compute.manager [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing instance network info cache due to event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.457 186962 DEBUG oslo_concurrency.lockutils [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.457 186962 DEBUG oslo_concurrency.lockutils [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.457 186962 DEBUG nova.network.neutron [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.484 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.484 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.530 186962 DEBUG nova.compute.provider_tree [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.544 186962 DEBUG nova.scheduler.client.report [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.576 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.649 186962 DEBUG oslo_concurrency.lockutils [None req-5f3f8705-f7d7-400e-a379-ee53bfff6627 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 32.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:08.796 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:08 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:08.797 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:20:08 np0005539505 nova_compute[186958]: 2025-11-29 07:20:08.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:09 np0005539505 nova_compute[186958]: 2025-11-29 07:20:09.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:09 np0005539505 nova_compute[186958]: 2025-11-29 07:20:09.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:09 np0005539505 nova_compute[186958]: 2025-11-29 07:20:09.916 186962 DEBUG nova.network.neutron [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated VIF entry in instance network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:20:09 np0005539505 nova_compute[186958]: 2025-11-29 07:20:09.917 186962 DEBUG nova.network.neutron [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": null, "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tapc373f1d7-16", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:09 np0005539505 nova_compute[186958]: 2025-11-29 07:20:09.937 186962 DEBUG oslo_concurrency.lockutils [req-06c54318-c561-4eb3-a331-bb4891086736 req-273466bd-cbc8-4b33-aa24-6cd4aa5b5f25 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:10 np0005539505 nova_compute[186958]: 2025-11-29 07:20:10.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:10 np0005539505 podman[235274]: 2025-11-29 07:20:10.722493812 +0000 UTC m=+0.055416303 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:20:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:10.799 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:13 np0005539505 nova_compute[186958]: 2025-11-29 07:20:13.303 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.048 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.154 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.155 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.155 186962 INFO nova.compute.manager [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Unshelving#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.292 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.293 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.297 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_requests' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.315 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'numa_topology' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.339 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.339 186962 INFO nova.compute.claims [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.522 186962 DEBUG nova.compute.provider_tree [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.544 186962 DEBUG nova.scheduler.client.report [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.571 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:15 np0005539505 nova_compute[186958]: 2025-11-29 07:20:15.729 186962 INFO nova.network.neutron [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating port c373f1d7-168e-494b-8e6f-c8af44b0db68 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.368 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.368 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.369 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.369 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.369 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.379 186962 INFO nova.compute.manager [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Terminating instance#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.392 186962 DEBUG nova.compute.manager [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.397 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.398 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.398 186962 DEBUG nova.network.neutron [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.402 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.403 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:20:16 np0005539505 kernel: tap4c93f14a-a5 (unregistering): left promiscuous mode
Nov 29 02:20:16 np0005539505 NetworkManager[55134]: <info>  [1764400816.4189] device (tap4c93f14a-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.426 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:16Z|00494|binding|INFO|Releasing lport 4c93f14a-a590-48c6-acc4-f7ec9a91f59f from this chassis (sb_readonly=0)
Nov 29 02:20:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:16Z|00495|binding|INFO|Setting lport 4c93f14a-a590-48c6-acc4-f7ec9a91f59f down in Southbound
Nov 29 02:20:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:16Z|00496|binding|INFO|Removing iface tap4c93f14a-a5 ovn-installed in OVS
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.440 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:74:96 10.100.0.11'], port_security=['fa:16:3e:74:74:96 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd71f022d-ac2d-48cb-bc26-3a9097ba969e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90812230-35cb-4e21-b16b-75b900100d8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '16d7af1670ea460db3d0422f176b6f98', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a81715ba-eace-471d-9f71-9964fcbf6d85', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41b9bfbf-a9b3-4bdb-9144-e5db6a660517, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4c93f14a-a590-48c6-acc4-f7ec9a91f59f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.444 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4c93f14a-a590-48c6-acc4-f7ec9a91f59f in datapath 90812230-35cb-4e21-b16b-75b900100d8b unbound from our chassis#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.445 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.445 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90812230-35cb-4e21-b16b-75b900100d8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.446 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7522d538-d0c9-453d-8bd1-bb1da9f9efdd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.447 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b namespace which is not needed anymore#033[00m
Nov 29 02:20:16 np0005539505 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Nov 29 02:20:16 np0005539505 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d0000006b.scope: Consumed 17.848s CPU time.
Nov 29 02:20:16 np0005539505 systemd-machined[153285]: Machine qemu-55-instance-0000006b terminated.
Nov 29 02:20:16 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [NOTICE]   (234163) : haproxy version is 2.8.14-c23fe91
Nov 29 02:20:16 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [NOTICE]   (234163) : path to executable is /usr/sbin/haproxy
Nov 29 02:20:16 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [WARNING]  (234163) : Exiting Master process...
Nov 29 02:20:16 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [ALERT]    (234163) : Current worker (234165) exited with code 143 (Terminated)
Nov 29 02:20:16 np0005539505 neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b[234159]: [WARNING]  (234163) : All workers exited. Exiting... (0)
Nov 29 02:20:16 np0005539505 systemd[1]: libpod-12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f.scope: Deactivated successfully.
Nov 29 02:20:16 np0005539505 podman[235318]: 2025-11-29 07:20:16.582436887 +0000 UTC m=+0.050394621 container died 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:20:16 np0005539505 systemd[1]: var-lib-containers-storage-overlay-ceac94602c8b4e563d7c2f5fbb60361cd619db799d6e4b7b18d19b9f3d2ec593-merged.mount: Deactivated successfully.
Nov 29 02:20:16 np0005539505 podman[235318]: 2025-11-29 07:20:16.646947637 +0000 UTC m=+0.114905371 container cleanup 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:20:16 np0005539505 systemd[1]: libpod-conmon-12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f.scope: Deactivated successfully.
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.670 186962 INFO nova.virt.libvirt.driver [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Instance destroyed successfully.#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.670 186962 DEBUG nova.objects.instance [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lazy-loading 'resources' on Instance uuid d71f022d-ac2d-48cb-bc26-3a9097ba969e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.702 186962 DEBUG nova.virt.libvirt.vif [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1971408115',display_name='tempest-tempest.common.compute-instance-1971408115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1971408115',id=107,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMX5/zkiagzb49OjkaqyK11NwIMqRBntzGSeeTFE8j9TqNEslf5JYsUjf3moZ2PI1ppBz9BY/MWjfh23WcLG+y1kvSGlR73yX4w+oRZT0XI0twWDsk7St4EGSLFKX+q9yQ==',key_name='tempest-keypair-527901596',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='16d7af1670ea460db3d0422f176b6f98',ramdisk_id='',reservation_id='r-9f0t20th',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-844604773',owner_user_name='tempest-AttachInterfacesTestJSON-844604773-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:17:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a13f011f5b74a6f94a2d2c8e9104f4a',uuid=d71f022d-ac2d-48cb-bc26-3a9097ba969e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.703 186962 DEBUG nova.network.os_vif_util [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converting VIF {"id": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "address": "fa:16:3e:74:74:96", "network": {"id": "90812230-35cb-4e21-b16b-75b900100d8b", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1372179517-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "16d7af1670ea460db3d0422f176b6f98", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c93f14a-a5", "ovs_interfaceid": "4c93f14a-a590-48c6-acc4-f7ec9a91f59f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.704 186962 DEBUG nova.network.os_vif_util [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.704 186962 DEBUG os_vif [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.705 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.706 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c93f14a-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.748 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.750 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.752 186962 INFO os_vif [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:74:96,bridge_name='br-int',has_traffic_filtering=True,id=4c93f14a-a590-48c6-acc4-f7ec9a91f59f,network=Network(90812230-35cb-4e21-b16b-75b900100d8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c93f14a-a5')#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.753 186962 INFO nova.virt.libvirt.driver [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Deleting instance files /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e_del#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.754 186962 INFO nova.virt.libvirt.driver [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Deletion of /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e_del complete#033[00m
Nov 29 02:20:16 np0005539505 podman[235353]: 2025-11-29 07:20:16.761687573 +0000 UTC m=+0.087173625 container remove 12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.766 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[de7121b2-a770-4952-8a10-a068f69bc6bc]: (4, ('Sat Nov 29 07:20:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f)\n12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f\nSat Nov 29 07:20:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b (12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f)\n12db6bb758cc7fb00d3abb8fe9d7e006febc6c17fc270b7a12821d42708ffb9f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.768 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e76775f-dd42-4b9c-af4e-e2d435889384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.769 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90812230-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.770 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 kernel: tap90812230-30: left promiscuous mode
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.773 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.775 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[92611ed2-f5af-4ddd-97db-77a47c63eec9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.785 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.786 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-0000006b, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/d71f022d-ac2d-48cb-bc26-3a9097ba969e/disk#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.796 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd62274b-eaf4-4097-a6d7-d8c255ce4121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.798 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[46def2bd-6486-4ebd-84a3-ee5c03bc4761]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.818 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[63ac06f1-4822-47d1-8268-fe4a1c1aac3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 611846, 'reachable_time': 25484, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235371, 'error': None, 'target': 'ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.821 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90812230-35cb-4e21-b16b-75b900100d8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:20:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:16.821 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[1712601a-ac56-49de-afd9-942b8ae0238b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:16 np0005539505 systemd[1]: run-netns-ovnmeta\x2d90812230\x2d35cb\x2d4e21\x2db16b\x2d75b900100d8b.mount: Deactivated successfully.
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.960 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.962 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5709MB free_disk=73.11267471313477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.962 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:16 np0005539505 nova_compute[186958]: 2025-11-29 07:20:16.962 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.021 186962 INFO nova.compute.manager [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Took 0.63 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.022 186962 DEBUG oslo.service.loopingcall [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.022 186962 DEBUG nova.compute.manager [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.022 186962 DEBUG nova.network.neutron [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.064 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance d71f022d-ac2d-48cb-bc26-3a9097ba969e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.064 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance aa4795d1-71b1-415f-ac22-5bb11775bc84 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.065 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.065 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.132 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.151 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.277 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.277 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.627 186962 DEBUG nova.network.neutron [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.654 186962 INFO nova.compute.manager [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Took 0.63 seconds to deallocate network for instance.#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.678 186962 DEBUG nova.network.neutron [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.704 186962 DEBUG nova.compute.manager [req-8a9e0de7-16a2-4808-84e0-e2516a8677bd req-d87e263d-b29d-4081-bcae-9b802ed71f75 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-deleted-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.714 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.716 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.717 186962 INFO nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating image(s)#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.718 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.718 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.720 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.720 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'trusted_certs' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.741 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "e69d3f3d9d2adc72437096d883077081c1258369" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.743 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "e69d3f3d9d2adc72437096d883077081c1258369" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.751 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.752 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:17 np0005539505 nova_compute[186958]: 2025-11-29 07:20:17.815 186962 DEBUG nova.compute.provider_tree [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:18 np0005539505 nova_compute[186958]: 2025-11-29 07:20:18.101 186962 DEBUG nova.scheduler.client.report [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:18 np0005539505 nova_compute[186958]: 2025-11-29 07:20:18.274 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.633 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.634 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.634 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.636 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.639 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.640 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing instance network info cache due to event network-changed-c373f1d7-168e-494b-8e6f-c8af44b0db68. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.640 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.640 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.641 186962 DEBUG nova.network.neutron [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Refreshing network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.671 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.707 186962 INFO nova.scheduler.client.report [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Deleted allocations for instance d71f022d-ac2d-48cb-bc26-3a9097ba969e#033[00m
Nov 29 02:20:19 np0005539505 podman[235372]: 2025-11-29 07:20:19.716194162 +0000 UTC m=+0.050387470 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:20:19 np0005539505 podman[235373]: 2025-11-29 07:20:19.759025867 +0000 UTC m=+0.092387961 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:19 np0005539505 nova_compute[186958]: 2025-11-29 07:20:19.821 186962 DEBUG oslo_concurrency.lockutils [None req-495fbbf2-a079-4493-8aec-e82174b619c0 9a13f011f5b74a6f94a2d2c8e9104f4a 16d7af1670ea460db3d0422f176b6f98 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.000 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.062 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.part --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.063 186962 DEBUG nova.virt.images [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] 36058703-a33f-4375-a956-154dfec4971e was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.064 186962 DEBUG nova.privsep.utils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.065 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.part /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.533 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.part /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.converted" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.544 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.603 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.605 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "e69d3f3d9d2adc72437096d883077081c1258369" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.618 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.675 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.676 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "e69d3f3d9d2adc72437096d883077081c1258369" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.677 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "e69d3f3d9d2adc72437096d883077081c1258369" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.688 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.745 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:20 np0005539505 nova_compute[186958]: 2025-11-29 07:20:20.746 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369,backing_fmt=raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.173 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369,backing_fmt=raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk 1073741824" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.174 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "e69d3f3d9d2adc72437096d883077081c1258369" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.497s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.174 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.226 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.227 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'migration_context' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.477 186962 INFO nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Rebasing disk image.#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.477 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.531 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.532 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:21 np0005539505 nova_compute[186958]: 2025-11-29 07:20:21.749 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.982 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 -F raw /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk" returned: 0 in 1.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.982 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.983 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Ensure instance console log exists: /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.983 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.983 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.984 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.986 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start _get_guest_xml network_info=[{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='4e565702f89275e3c016972dccadb9ae',container_format='bare',created_at=2025-11-29T07:19:34Z,direct_url=<?>,disk_format='qcow2',id=36058703-a33f-4375-a956-154dfec4971e,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1506153238-shelved',owner='329bbbdd41424742b3045e77150a498e',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-29T07:20:04Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.989 186962 WARNING nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.995 186962 DEBUG nova.virt.libvirt.host [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.996 186962 DEBUG nova.virt.libvirt.host [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:20:22 np0005539505 nova_compute[186958]: 2025-11-29 07:20:22.999 186962 DEBUG nova.virt.libvirt.host [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.000 186962 DEBUG nova.virt.libvirt.host [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.001 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.001 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='4e565702f89275e3c016972dccadb9ae',container_format='bare',created_at=2025-11-29T07:19:34Z,direct_url=<?>,disk_format='qcow2',id=36058703-a33f-4375-a956-154dfec4971e,min_disk=1,min_ram=0,name='tempest-ServersNegativeTestJSON-server-1506153238-shelved',owner='329bbbdd41424742b3045e77150a498e',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-11-29T07:20:04Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.002 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.002 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.002 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.003 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.003 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.003 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.003 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.004 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.004 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.004 186962 DEBUG nova.virt.hardware [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.004 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'vcpu_model' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.200 186962 DEBUG nova.virt.libvirt.vif [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='36058703-a33f-4375-a956-154dfec4971e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member',shelved_at='2025-11-29T07:20:05.187714',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='36058703-a33f-4375-a956-154dfec4971e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:15Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.200 186962 DEBUG nova.network.os_vif_util [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.201 186962 DEBUG nova.network.os_vif_util [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.202 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_devices' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.226 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <uuid>aa4795d1-71b1-415f-ac22-5bb11775bc84</uuid>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <name>instance-00000068</name>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersNegativeTestJSON-server-1506153238</nova:name>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:20:22</nova:creationTime>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:user uuid="2647a3e4fc214b4a85db1283eb7ef117">tempest-ServersNegativeTestJSON-1191192320-project-member</nova:user>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:project uuid="329bbbdd41424742b3045e77150a498e">tempest-ServersNegativeTestJSON-1191192320</nova:project>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="36058703-a33f-4375-a956-154dfec4971e"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        <nova:port uuid="c373f1d7-168e-494b-8e6f-c8af44b0db68">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="serial">aa4795d1-71b1-415f-ac22-5bb11775bc84</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="uuid">aa4795d1-71b1-415f-ac22-5bb11775bc84</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:1b:c4:96"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <target dev="tapc373f1d7-16"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/console.log" append="off"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:20:23 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:20:23 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:20:23 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:20:23 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.227 186962 DEBUG nova.compute.manager [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Preparing to wait for external event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.228 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.228 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.228 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.229 186962 DEBUG nova.virt.libvirt.vif [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='36058703-a33f-4375-a956-154dfec4971e',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:17:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member',shelved_at='2025-11-29T07:20:05.187714',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='36058703-a33f-4375-a956-154dfec4971e'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:15Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.229 186962 DEBUG nova.network.os_vif_util [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.229 186962 DEBUG nova.network.os_vif_util [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.230 186962 DEBUG os_vif [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.230 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.231 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.231 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.234 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc373f1d7-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.235 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc373f1d7-16, col_values=(('external_ids', {'iface-id': 'c373f1d7-168e-494b-8e6f-c8af44b0db68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:c4:96', 'vm-uuid': 'aa4795d1-71b1-415f-ac22-5bb11775bc84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:23 np0005539505 NetworkManager[55134]: <info>  [1764400823.2373] manager: (tapc373f1d7-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.240 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.244 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.244 186962 INFO os_vif [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16')#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.292 186962 DEBUG nova.network.neutron [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated VIF entry in instance network info cache for port c373f1d7-168e-494b-8e6f-c8af44b0db68. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.293 186962 DEBUG nova.network.neutron [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.650 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.651 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-unplugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.652 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.653 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.653 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.653 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-unplugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.654 186962 WARNING nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-unplugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.655 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.655 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.656 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.656 186962 DEBUG oslo_concurrency.lockutils [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "d71f022d-ac2d-48cb-bc26-3a9097ba969e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.657 186962 DEBUG nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] No waiting events found dispatching network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.657 186962 WARNING nova.compute.manager [req-b4e1d8c5-9c74-4637-aea4-f0fde144ffcf req-0af3bc2b-7770-459b-998d-49f74677c12c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Received unexpected event network-vif-plugged-4c93f14a-a590-48c6-acc4-f7ec9a91f59f for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.658 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.659 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.659 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.756 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.756 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.757 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] No VIF found with MAC fa:16:3e:1b:c4:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.757 186962 INFO nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Using config drive#033[00m
Nov 29 02:20:23 np0005539505 nova_compute[186958]: 2025-11-29 07:20:23.951 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'ec2_ids' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:24 np0005539505 nova_compute[186958]: 2025-11-29 07:20:24.005 186962 DEBUG nova.objects.instance [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'keypairs' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.052 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.274 186962 INFO nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Creating config drive at /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.278 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpds3ixxvx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.403 186962 DEBUG oslo_concurrency.processutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpds3ixxvx" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:25 np0005539505 kernel: tapc373f1d7-16: entered promiscuous mode
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.4807] manager: (tapc373f1d7-16): new Tun device (/org/freedesktop/NetworkManager/Devices/245)
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:25Z|00497|binding|INFO|Claiming lport c373f1d7-168e-494b-8e6f-c8af44b0db68 for this chassis.
Nov 29 02:20:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:25Z|00498|binding|INFO|c373f1d7-168e-494b-8e6f-c8af44b0db68: Claiming fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:20:25 np0005539505 systemd-udevd[235498]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:20:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:25Z|00499|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 ovn-installed in OVS
Nov 29 02:20:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:25Z|00500|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 up in Southbound
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.590 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '7', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.591 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 bound to our chassis#033[00m
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.5944] device (tapc373f1d7-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.5956] device (tapc373f1d7-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.596 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14d61e69-b152-4adc-a95c-58748969e299#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.608 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d30aa70c-72f1-4e00-9929-8aba99158a85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 podman[235468]: 2025-11-29 07:20:25.609341369 +0000 UTC m=+0.132918852 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.609 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14d61e69-b1 in ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:20:25 np0005539505 systemd-machined[153285]: New machine qemu-57-instance-00000068.
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.611 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14d61e69-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.611 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83312208-87f5-4628-a50c-ba9e932f4d66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.612 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3de9a2d6-e47c-471d-ae12-8cc362f0bcf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.623 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[18249664-0d08-4a45-895a-2f980c917245]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 systemd[1]: Started Virtual Machine qemu-57-instance-00000068.
Nov 29 02:20:25 np0005539505 podman[235467]: 2025-11-29 07:20:25.628848352 +0000 UTC m=+0.156851711 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.636 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a52e441-a0b3-43ea-a257-66713ba334f7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.663 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[72132a7e-1f08-401a-aa79-c5684caad35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.671 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc36248-df38-4ea6-8610-9cacb02dcafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.6728] manager: (tap14d61e69-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/246)
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.698 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a1597764-6649-491c-ad31-8a9ebcbf26f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.702 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3b46a002-21b5-47ac-a649-f0b3af61e8fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.7221] device (tap14d61e69-b0): carrier: link connected
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.728 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[73e04413-e61e-4529-9616-bdf08dea4ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.747 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b5416b-7219-4d8f-b7f3-b278e7c5c3fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627330, 'reachable_time': 42315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235546, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.764 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6123f275-5e71-4c3f-a054-05628111838a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:42d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 627330, 'tstamp': 627330}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235547, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.781 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d7eb0b65-4938-4d95-9900-e641cfc5fb5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627330, 'reachable_time': 42315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235548, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.808 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[69310b63-01c5-4283-97fc-73ea46343ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.869 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7fcbc8a-a1fe-4a49-9cff-3ea9252c9cf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.871 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.871 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.872 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14d61e69-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.874 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 NetworkManager[55134]: <info>  [1764400825.8749] manager: (tap14d61e69-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Nov 29 02:20:25 np0005539505 kernel: tap14d61e69-b0: entered promiscuous mode
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.877 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14d61e69-b0, col_values=(('external_ids', {'iface-id': '17905b79-5cd7-4b55-9191-5d935325b1f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:25Z|00501|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.880 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.882 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.883 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c27c07ae-b06a-454b-b233-2576759509a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.884 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:20:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:25.886 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'env', 'PROCESS_TAG=haproxy-14d61e69-b152-4adc-a95c-58748969e299', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14d61e69-b152-4adc-a95c-58748969e299.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.891 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.899 186962 DEBUG nova.compute.manager [req-0a745457-8bb3-41d5-b1f4-0926c327dd09 req-f2e48dc8-bf55-4448-a0ba-2934669a0a23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.900 186962 DEBUG oslo_concurrency.lockutils [req-0a745457-8bb3-41d5-b1f4-0926c327dd09 req-f2e48dc8-bf55-4448-a0ba-2934669a0a23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.900 186962 DEBUG oslo_concurrency.lockutils [req-0a745457-8bb3-41d5-b1f4-0926c327dd09 req-f2e48dc8-bf55-4448-a0ba-2934669a0a23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.900 186962 DEBUG oslo_concurrency.lockutils [req-0a745457-8bb3-41d5-b1f4-0926c327dd09 req-f2e48dc8-bf55-4448-a0ba-2934669a0a23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:25 np0005539505 nova_compute[186958]: 2025-11-29 07:20:25.901 186962 DEBUG nova.compute.manager [req-0a745457-8bb3-41d5-b1f4-0926c327dd09 req-f2e48dc8-bf55-4448-a0ba-2934669a0a23 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Processing event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.141 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.155 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.156 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.229 186962 DEBUG nova.compute.manager [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.230 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400826.228724, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.230 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Started (Lifecycle Event)#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.233 186962 DEBUG nova.virt.libvirt.driver [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.235 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance spawned successfully.#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.252 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.255 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.276 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.276 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400826.2292047, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.276 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.303 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.306 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400826.2325492, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.307 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.332 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.335 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:26 np0005539505 podman[235584]: 2025-11-29 07:20:26.24804574 +0000 UTC m=+0.032628516 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:20:26 np0005539505 nova_compute[186958]: 2025-11-29 07:20:26.355 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:20:26 np0005539505 podman[235584]: 2025-11-29 07:20:26.483772639 +0000 UTC m=+0.268355395 container create 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:20:26 np0005539505 systemd[1]: Started libpod-conmon-5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30.scope.
Nov 29 02:20:26 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:20:26 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32c33047ab6087c733b5b6d300cb538e9c89e2e286a0278ba481a975ab5373f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:20:26 np0005539505 podman[235584]: 2025-11-29 07:20:26.701746713 +0000 UTC m=+0.486329489 container init 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:20:26 np0005539505 podman[235584]: 2025-11-29 07:20:26.707936559 +0000 UTC m=+0.492519315 container start 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:20:26 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [NOTICE]   (235603) : New worker (235605) forked
Nov 29 02:20:26 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [NOTICE]   (235603) : Loading success.
Nov 29 02:20:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:26.959 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:26.961 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:26.961 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:28 np0005539505 nova_compute[186958]: 2025-11-29 07:20:28.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:28 np0005539505 nova_compute[186958]: 2025-11-29 07:20:28.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.903 186962 DEBUG nova.compute.manager [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.904 186962 DEBUG oslo_concurrency.lockutils [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.904 186962 DEBUG oslo_concurrency.lockutils [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.904 186962 DEBUG oslo_concurrency.lockutils [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.905 186962 DEBUG nova.compute.manager [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:29 np0005539505 nova_compute[186958]: 2025-11-29 07:20:29.905 186962 WARNING nova.compute.manager [req-6724ec51-3fb0-4325-ba8a-2ef5810a0cf4 req-00b761f2-4493-4032-9065-d0bfafec848e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state shelved_offloaded and task_state spawning.#033[00m
Nov 29 02:20:30 np0005539505 nova_compute[186958]: 2025-11-29 07:20:30.054 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.007 186962 DEBUG nova.compute.manager [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.113 186962 DEBUG oslo_concurrency.lockutils [None req-dc9d83b8-978a-44df-bbbe-8540027d15e0 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 15.959s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:31Z|00502|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.411 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:31Z|00503|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.669 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400816.6687007, d71f022d-ac2d-48cb-bc26-3a9097ba969e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.670 186962 INFO nova.compute.manager [-] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:20:31 np0005539505 nova_compute[186958]: 2025-11-29 07:20:31.694 186962 DEBUG nova.compute.manager [None req-5c86cfc6-4b67-4644-abc4-7b140b5e20f4 - - - - - -] [instance: d71f022d-ac2d-48cb-bc26-3a9097ba969e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:33 np0005539505 nova_compute[186958]: 2025-11-29 07:20:33.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:35 np0005539505 nova_compute[186958]: 2025-11-29 07:20:35.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.201 186962 DEBUG nova.objects.instance [None req-002b1d90-6a86-4bd1-a078-166d9eafc327 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_devices' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.326 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400838.3258379, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.326 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.352 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.356 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:38 np0005539505 nova_compute[186958]: 2025-11-29 07:20:38.379 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 02:20:38 np0005539505 podman[235627]: 2025-11-29 07:20:38.725164985 +0000 UTC m=+0.056885855 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Nov 29 02:20:38 np0005539505 podman[235628]: 2025-11-29 07:20:38.756411931 +0000 UTC m=+0.086440643 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:20:39 np0005539505 kernel: tapc373f1d7-16 (unregistering): left promiscuous mode
Nov 29 02:20:39 np0005539505 NetworkManager[55134]: <info>  [1764400839.7956] device (tapc373f1d7-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:39 np0005539505 nova_compute[186958]: 2025-11-29 07:20:39.799 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:39Z|00504|binding|INFO|Releasing lport c373f1d7-168e-494b-8e6f-c8af44b0db68 from this chassis (sb_readonly=0)
Nov 29 02:20:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:39Z|00505|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 down in Southbound
Nov 29 02:20:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:39Z|00506|binding|INFO|Removing iface tapc373f1d7-16 ovn-installed in OVS
Nov 29 02:20:39 np0005539505 nova_compute[186958]: 2025-11-29 07:20:39.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:39.807 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '9', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:39.808 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 unbound from our chassis#033[00m
Nov 29 02:20:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:39.809 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14d61e69-b152-4adc-a95c-58748969e299, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:20:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:39.811 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4dce71be-2693-4b84-9684-97251f39b9b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:39.811 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace which is not needed anymore#033[00m
Nov 29 02:20:39 np0005539505 nova_compute[186958]: 2025-11-29 07:20:39.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:39 np0005539505 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000068.scope: Deactivated successfully.
Nov 29 02:20:39 np0005539505 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000068.scope: Consumed 12.305s CPU time.
Nov 29 02:20:39 np0005539505 systemd-machined[153285]: Machine qemu-57-instance-00000068 terminated.
Nov 29 02:20:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [NOTICE]   (235603) : haproxy version is 2.8.14-c23fe91
Nov 29 02:20:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [NOTICE]   (235603) : path to executable is /usr/sbin/haproxy
Nov 29 02:20:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [WARNING]  (235603) : Exiting Master process...
Nov 29 02:20:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [ALERT]    (235603) : Current worker (235605) exited with code 143 (Terminated)
Nov 29 02:20:39 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235599]: [WARNING]  (235603) : All workers exited. Exiting... (0)
Nov 29 02:20:39 np0005539505 systemd[1]: libpod-5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30.scope: Deactivated successfully.
Nov 29 02:20:39 np0005539505 podman[235696]: 2025-11-29 07:20:39.948496024 +0000 UTC m=+0.050130223 container died 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:20:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30-userdata-shm.mount: Deactivated successfully.
Nov 29 02:20:39 np0005539505 systemd[1]: var-lib-containers-storage-overlay-32c33047ab6087c733b5b6d300cb538e9c89e2e286a0278ba481a975ab5373f5-merged.mount: Deactivated successfully.
Nov 29 02:20:39 np0005539505 podman[235696]: 2025-11-29 07:20:39.98854476 +0000 UTC m=+0.090178959 container cleanup 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:20:39 np0005539505 NetworkManager[55134]: <info>  [1764400839.9903] manager: (tapc373f1d7-16): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Nov 29 02:20:39 np0005539505 systemd[1]: libpod-conmon-5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30.scope: Deactivated successfully.
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.019 186962 DEBUG nova.compute.manager [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.020 186962 DEBUG oslo_concurrency.lockutils [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.020 186962 DEBUG oslo_concurrency.lockutils [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.021 186962 DEBUG oslo_concurrency.lockutils [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.021 186962 DEBUG nova.compute.manager [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.021 186962 WARNING nova.compute.manager [req-a2724e4e-b436-4838-abd9-df88e153dde7 req-da16f8ca-ad71-4cf1-9d66-d4a978a20076 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state suspending.#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.025 186962 DEBUG nova.compute.manager [None req-002b1d90-6a86-4bd1-a078-166d9eafc327 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:40 np0005539505 podman[235735]: 2025-11-29 07:20:40.051439465 +0000 UTC m=+0.040763458 container remove 5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.057 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.058 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[37a71989-a083-4126-b065-94051267eef2]: (4, ('Sat Nov 29 07:20:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30)\n5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30\nSat Nov 29 07:20:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30)\n5682b8c7b89276e836f21221670ab6cecef8993ccef0ccc394269abd5a083f30\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.059 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bf0ce485-c14a-488b-b18b-e65938224bc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.059 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.060 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:40 np0005539505 kernel: tap14d61e69-b0: left promiscuous mode
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.072 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:40 np0005539505 nova_compute[186958]: 2025-11-29 07:20:40.075 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.078 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0bbeeeb5-01be-4b40-b1cf-254db9833d0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.096 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a8200e58-956c-4526-99f7-fce77bdfa7bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.097 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce48abd-d0d1-4c46-b3fe-6ec774db6899]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.112 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9820bbbb-e41b-4626-a507-0e249396a02b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 627324, 'reachable_time': 30765, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235764, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:40 np0005539505 systemd[1]: run-netns-ovnmeta\x2d14d61e69\x2db152\x2d4adc\x2da95c\x2d58748969e299.mount: Deactivated successfully.
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.117 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:20:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:40.117 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[747d8b49-8bb6-4a3a-968f-b38ea4b3e4ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:41 np0005539505 podman[235765]: 2025-11-29 07:20:41.708752907 +0000 UTC m=+0.045258325 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.353 186962 DEBUG nova.compute.manager [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.354 186962 DEBUG oslo_concurrency.lockutils [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.354 186962 DEBUG oslo_concurrency.lockutils [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.354 186962 DEBUG oslo_concurrency.lockutils [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.354 186962 DEBUG nova.compute.manager [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.354 186962 WARNING nova.compute.manager [req-8db320c2-019e-4e77-96bb-25932bef01b1 req-88a010c7-bdbe-462d-ab80-073f631d2c81 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state suspended and task_state None.#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.876 186962 INFO nova.compute.manager [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Resuming#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.877 186962 DEBUG nova.objects.instance [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'flavor' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.934 186962 DEBUG oslo_concurrency.lockutils [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.934 186962 DEBUG oslo_concurrency.lockutils [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquired lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:42 np0005539505 nova_compute[186958]: 2025-11-29 07:20:42.934 186962 DEBUG nova.network.neutron [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:43 np0005539505 nova_compute[186958]: 2025-11-29 07:20:43.243 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.060 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.428 186962 DEBUG nova.network.neutron [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [{"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.829 186962 DEBUG oslo_concurrency.lockutils [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Releasing lock "refresh_cache-aa4795d1-71b1-415f-ac22-5bb11775bc84" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.836 186962 DEBUG nova.virt.libvirt.vif [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:20:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:20:40Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.837 186962 DEBUG nova.network.os_vif_util [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.839 186962 DEBUG nova.network.os_vif_util [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.840 186962 DEBUG os_vif [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.841 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.841 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.844 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.845 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc373f1d7-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.845 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc373f1d7-16, col_values=(('external_ids', {'iface-id': 'c373f1d7-168e-494b-8e6f-c8af44b0db68', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:c4:96', 'vm-uuid': 'aa4795d1-71b1-415f-ac22-5bb11775bc84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.846 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.846 186962 INFO os_vif [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16')#033[00m
Nov 29 02:20:45 np0005539505 nova_compute[186958]: 2025-11-29 07:20:45.921 186962 DEBUG nova.objects.instance [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'numa_topology' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:45 np0005539505 kernel: tapc373f1d7-16: entered promiscuous mode
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.0001] manager: (tapc373f1d7-16): new Tun device (/org/freedesktop/NetworkManager/Devices/249)
Nov 29 02:20:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:46Z|00507|binding|INFO|Claiming lport c373f1d7-168e-494b-8e6f-c8af44b0db68 for this chassis.
Nov 29 02:20:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:46Z|00508|binding|INFO|c373f1d7-168e-494b-8e6f-c8af44b0db68: Claiming fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.002 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:46Z|00509|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 ovn-installed in OVS
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.015 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.017 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 systemd-udevd[235798]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:20:46 np0005539505 systemd-machined[153285]: New machine qemu-58-instance-00000068.
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.0430] device (tapc373f1d7-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.0442] device (tapc373f1d7-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:20:46 np0005539505 systemd[1]: Started Virtual Machine qemu-58-instance-00000068.
Nov 29 02:20:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:46Z|00510|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 up in Southbound
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.293 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '10', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.295 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 bound to our chassis#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.297 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 14d61e69-b152-4adc-a95c-58748969e299#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.309 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[583a151d-a7b4-475d-a309-baff636d81a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.310 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap14d61e69-b1 in ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.312 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap14d61e69-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.312 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a18374-86f0-4c80-a0df-a60d69d21ed8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.313 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e21e9c00-22b8-48b0-903b-379ec94b548e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.324 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[f53f756c-1cee-4015-8eb9-b7d59a5c0c51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.347 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c2668f-2766-49fb-91ab-06a8736d1916]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.380 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1e599938-fd75-48bb-9e26-70bfcf062506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.386 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdf2215-b95c-4cff-b24c-cc3299386820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.3878] manager: (tap14d61e69-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/250)
Nov 29 02:20:46 np0005539505 systemd-udevd[235801]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.417 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[286e620f-abd5-4863-a3d1-c7621c6ea0a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.420 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f628a462-690d-496f-afcf-c6bd6c457671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.4404] device (tap14d61e69-b0): carrier: link connected
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.443 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7fcd493f-fd88-4bf3-ab74-f621fbb7fde4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.459 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[361e96f4-d24a-44a0-a9e8-5eba6da100f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629402, 'reachable_time': 30525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235832, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.475 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[60f77d0b-48f3-4145-b22e-945ca8a6ee23]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:42d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 629402, 'tstamp': 629402}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235833, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.492 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8e166e23-5324-441b-8d5e-80c3c86665be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap14d61e69-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:42:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629402, 'reachable_time': 30525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235834, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.524 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aa87131d-24ef-4acc-92a4-cc67e0904ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.587 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[83aab8df-0510-4bd6-b781-e12c5c0bfacc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.589 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.590 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.590 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14d61e69-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:46 np0005539505 NetworkManager[55134]: <info>  [1764400846.5931] manager: (tap14d61e69-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Nov 29 02:20:46 np0005539505 kernel: tap14d61e69-b0: entered promiscuous mode
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.595 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap14d61e69-b0, col_values=(('external_ids', {'iface-id': '17905b79-5cd7-4b55-9191-5d935325b1f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:46Z|00511|binding|INFO|Releasing lport 17905b79-5cd7-4b55-9191-5d935325b1f0 from this chassis (sb_readonly=0)
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.596 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 nova_compute[186958]: 2025-11-29 07:20:46.610 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.611 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.612 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[de364ee6-3946-4726-a094-ca3851caf90e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.613 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/14d61e69-b152-4adc-a95c-58748969e299.pid.haproxy
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 14d61e69-b152-4adc-a95c-58748969e299
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:20:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:46.616 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'env', 'PROCESS_TAG=haproxy-14d61e69-b152-4adc-a95c-58748969e299', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/14d61e69-b152-4adc-a95c-58748969e299.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:20:47 np0005539505 podman[235872]: 2025-11-29 07:20:47.003318021 +0000 UTC m=+0.051055950 container create 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:47 np0005539505 systemd[1]: Started libpod-conmon-882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37.scope.
Nov 29 02:20:47 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:20:47 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4868fce2b7197029fa5e7ce79af80c15274fb6c5fe420fd687347237b766e86/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:20:47 np0005539505 podman[235872]: 2025-11-29 07:20:46.977118328 +0000 UTC m=+0.024856257 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:20:47 np0005539505 podman[235872]: 2025-11-29 07:20:47.076046024 +0000 UTC m=+0.123783973 container init 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:20:47 np0005539505 podman[235872]: 2025-11-29 07:20:47.081725476 +0000 UTC m=+0.129463405 container start 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:47 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [NOTICE]   (235892) : New worker (235894) forked
Nov 29 02:20:47 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [NOTICE]   (235892) : Loading success.
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.140 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for aa4795d1-71b1-415f-ac22-5bb11775bc84 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.141 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400847.1403651, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.141 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Started (Lifecycle Event)#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.168 186962 DEBUG nova.compute.manager [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.169 186962 DEBUG nova.objects.instance [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'pci_devices' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.334 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.340 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.343 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance running successfully.#033[00m
Nov 29 02:20:47 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.346 186962 DEBUG nova.virt.libvirt.guest [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.346 186962 DEBUG nova.compute.manager [None req-b96be346-e6b5-4fe4-8111-d73f4ced1363 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.384 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.384 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400847.1474957, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.384 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.416 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.418 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.897 186962 DEBUG nova.compute.manager [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.897 186962 DEBUG oslo_concurrency.lockutils [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.898 186962 DEBUG oslo_concurrency.lockutils [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.898 186962 DEBUG oslo_concurrency.lockutils [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.898 186962 DEBUG nova.compute.manager [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:47 np0005539505 nova_compute[186958]: 2025-11-29 07:20:47.898 186962 WARNING nova.compute.manager [req-6460e2cd-06ce-4833-92f2-02d8cebee3a8 req-9823278f-1afd-4df0-819f-5e8b2846e0af 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.091 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000068', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '329bbbdd41424742b3045e77150a498e', 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'hostId': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.092 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.116 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.requests volume: 4 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.116 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be2cd32-ba19-4407-911d-f13cffd217ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 4, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.092138', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed70577e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': 'ee00e064f197e16ec8780202bb94e5eff13eedf1b542eb5dda2a17f8f5099a27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': 
None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.092138', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed7066ce-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '2b16ac6bd0da691e37d21130aba040b003542b9fe19f59640fe26fd613c28d98'}]}, 'timestamp': '2025-11-29 07:20:48.117299', '_unique_id': '5e4c40a0943b46a694dbf5e0cab7d2fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.118 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.119 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.bytes volume: 16384 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.119 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '969185d4-9f7e-48d6-866a-5a5a776a9522', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16384, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.119685', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed70d15e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '5628240184f1e8677a709d518e2840341c65af92461ced5e417e1208373423b6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.119685', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed70d9d8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '31a58675f7b829c2d6d071ac7d8c43b16334749e6c5a22a5548edebe7fee5fae'}]}, 'timestamp': '2025-11-29 07:20:48.120181', '_unique_id': '04ac2e0a3631448a83c0c6ca3e6f21fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.120 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.132 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.132 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7d5a6f7-a7ec-45cb-9eef-ddfa8ac1498b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.121455', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed72c680-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': 'a3346a4d83695f8d174a593f4cdb5fc95dfa7acef010d4e750e728159e5120ae'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.121455', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed72d22e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': '6120ee2319fe61aa89180b2c2877e443d20b05503574df1db5affde927e294f9'}]}, 'timestamp': '2025-11-29 07:20:48.133103', '_unique_id': '11759a5f7c4c48049ea1a2a466d1140b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.134 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.bytes volume: 6573568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.135 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.bytes volume: 219276 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80d5821e-ef93-4239-b6ff-384b7cc12f96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6573568, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.134862', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed73221a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': 'f337f24dbeca59938259d111337f0da0bcfc1120426e2c395854f8426e9e3880'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 219276, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.134862', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed732d00-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': 'c01d618678baec8423aceafc8470c5993535e295bb2c2d8214091789bc2115f0'}]}, 'timestamp': '2025-11-29 07:20:48.135433', '_unique_id': 'ca0ea5eee2f04d218157b531ce669e68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.137 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.137 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7016c898-6379-4f6f-afab-78304ecbffee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.137515', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed7389c6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': '6f80aa1672cd476c0c6933f297d145e10b7b76d00970a6dfb4f2f19e3ec00c2e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.137515', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed739682-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': 'a96a942fa0d6e5c4162efae9710a39d0c6350c2a39558aad350b766ae78f0f96'}]}, 'timestamp': '2025-11-29 07:20:48.138104', '_unique_id': 'cac603c193c443ea98779d0337f72f27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.138 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.139 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.141 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4bacceba-2471-41c5-ac46-e973a714c9c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.139446', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed742bec-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '0ddb4891b9037e2967f1305b51e35b96c175aadad4656e4506c5b696c8d6adeb'}]}, 'timestamp': '2025-11-29 07:20:48.141939', '_unique_id': '5cc8757d956d438d82c4eb8642fbaeb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.142 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.143 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.143 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b5e2893-cd39-497a-b180-c382b2d9442b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.143201', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed7468f0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': 'b95acddb22aaa1fc0f10a945cdc8c9787d396f23875d16a58b6539f31da08620'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.143201', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed7473cc-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.762310993, 'message_signature': 'e968d94c32fd7cae14fb3dc5eb63bb9ece2767338329ba7e1a9b81c6cffcbee7'}]}, 'timestamp': '2025-11-29 07:20:48.143802', '_unique_id': '4abbb519ea934975859354167db1e895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.144 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.145 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.bytes.delta volume: 746 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc13edb4-7d72-49c1-b411-5c63426bcdbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 746, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.145516', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed74c282-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '9196eb269da61b83e7bf1ddadfbd86baeb211f09b990350f7cd7c207fee198ef'}]}, 'timestamp': '2025-11-29 07:20:48.145832', '_unique_id': 'a0662e9e187a40e387abff60e29acfa5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.bytes volume: 746 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '725d86ef-239a-4af4-aa0f-625fde894976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 746, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.147188', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed7502e2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '3e957639bb352339e3ee47bd9762726ccb5bd5720111551316b0c63e093a19c4'}]}, 'timestamp': '2025-11-29 07:20:48.147437', '_unique_id': 'f9822e245e7f414ca165fe07f6da94aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.147 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.149 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e79fe6b1-737f-4a14-861e-48fc0d655f87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.149130', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed75531e-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': 'ca4c606859171f575351762d2d3d6d989cc723708081c683df8d708d564b2d4f'}]}, 'timestamp': '2025-11-29 07:20:48.149541', '_unique_id': '400650fc02d94b0ebccba9b47662413b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.150 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.151 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.latency volume: 59804886 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.151 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.latency volume: 20819987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b362b566-072a-45e4-9638-8ae1c73cf2fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59804886, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.151203', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed75a300-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '2d0e330636d9fa9a20c1d18bcc06d11781c5cbce6d2ddec70fceab0d8e7c92aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20819987, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.151203', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed75ae40-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '4fe8826d94499d770a6c200e3837d042ee2a0a98fcbb5dfe4d910ae0e796f6d0'}]}, 'timestamp': '2025-11-29 07:20:48.151857', '_unique_id': '629b5e13c1df4571ac2d6ca1a7943689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.153 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a906480-af9a-40f2-ade0-8d8843fb5a82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.153412', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed75f6b6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '88b55df384ef66c2bae77c11a64f5aebb51c83eb9907f1b3b1d40f17ff615c8d'}]}, 'timestamp': '2025-11-29 07:20:48.153720', '_unique_id': '1eea8b490a4e4f7fb01e6f7779007919'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.154 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.155 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34361969-183c-4f4e-9826-8e743acf6c61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.155180', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed763ce8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '4c92b551da6b410c2a87334964438f03b9fb3151dab92e33065f0a713aa6ef53'}]}, 'timestamp': '2025-11-29 07:20:48.155554', '_unique_id': '59903096f6914a9883d76f48d210c648'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.157 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.latency volume: 1389259 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.157 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad12b454-c588-4a1d-b4e9-7ceedb25e1b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1389259, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.156977', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed7681a8-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '1f9aad3bae43d70440cf4f1f01716746558225a38be15b2cd58bbede3bacdc4e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.156977', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed768d1a-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '0ec8c2e17cff87a2b30807d1ab4cf7499548a6cdd3cd605a279e4a6432d18206'}]}, 'timestamp': '2025-11-29 07:20:48.157539', '_unique_id': 'd000a6db64674694933bd40377fe48c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.158 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0bd945a-70c6-4b8e-9a59-acc13d2040ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.158743', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed76c58c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '124667188cce1d946de01762423dbadeaacbb050d24652bc8021eaa53941dbd1'}]}, 'timestamp': '2025-11-29 07:20:48.158973', '_unique_id': 'ddc558b3c82d4fd4b1acf051fea28260'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.159 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.161 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f3efad7-25db-4817-9a3e-92996c4b682e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.161082', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed7722c0-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '394981c06a9e287d1eb17ba8eecdb8505b89418d09a13ec760f6f2b1e2752f24'}]}, 'timestamp': '2025-11-29 07:20:48.161414', '_unique_id': 'c430e5d787a3441d88eab9d27f352c16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.180 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdbc283a-fe03-42e7-91d6-4bf3d0765dcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'timestamp': '2025-11-29T07:20:48.162972', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ed7a19b2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.82104753, 'message_signature': '2a05ed1c3002576ba2ca490158258e9eb82a6da1702760c9ca446be9ac87f67e'}]}, 'timestamp': '2025-11-29 07:20:48.180928', '_unique_id': '6724059caef54dd58ceaf5b86e6b5bf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.182 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.bytes volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b08318c6-4fc7-4cdd-ae6e-4a413f962732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.182599', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed7a69c6-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': 'ba89a059dec8c3de1564c886d4a7e52a3c45406f2dd5577803df0cf7cc9f8288'}]}, 'timestamp': '2025-11-29 07:20:48.182838', '_unique_id': '97ee943e03e542e799f685dfedc9972b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.183 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/network.incoming.bytes.delta volume: 532 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '796b4a23-4150-4395-94f4-0f83ce358a79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 532, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'instance-00000068-aa4795d1-71b1-415f-ac22-5bb11775bc84-tapc373f1d7-16', 'timestamp': '2025-11-29T07:20:48.183888', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'tapc373f1d7-16', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1b:c4:96', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc373f1d7-16'}, 'message_id': 'ed7a9bb2-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.780290123, 'message_signature': '88240d8ecea65a1788d201d576d594f59bc7c9885695b6f0562fa238e1b57a73'}]}, 'timestamp': '2025-11-29 07:20:48.184114', '_unique_id': '197b7465ff6f44a587669df2cf0f493f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.185 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.185 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/disk.device.read.requests volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3111fdb8-6448-4ebc-8241-a0066cfbe510', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-vda', 'timestamp': '2025-11-29T07:20:48.185310', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ed7ad406-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '56640f545b4b465bafa88fafc61f2ab48e0583455566f6801a88130dc63b7e23'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 88, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84-sda', 'timestamp': '2025-11-29T07:20:48.185310', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ed7adf3c-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.732966831, 'message_signature': '782387ccd44fa50e25ba2e1a2f863fd309dcbfd560baf92b4aff3f56c48a63ae'}]}, 'timestamp': '2025-11-29 07:20:48.185863', '_unique_id': '76ba2c5ca39b4075b343889872aa2b7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 DEBUG ceilometer.compute.pollsters [-] aa4795d1-71b1-415f-ac22-5bb11775bc84/cpu volume: 910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73f3e2ee-7d11-4509-9618-2ca9b4b2efe1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 910000000, 'user_id': '2647a3e4fc214b4a85db1283eb7ef117', 'user_name': None, 'project_id': '329bbbdd41424742b3045e77150a498e', 'project_name': None, 'resource_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'timestamp': '2025-11-29T07:20:48.187149', 'resource_metadata': {'display_name': 'tempest-ServersNegativeTestJSON-server-1506153238', 'name': 'instance-00000068', 'instance_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'instance_type': 'm1.nano', 'host': '22959cba132688d93f7873fbda2e8af3e9c1a1336c2032f9cf5dea7f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '36058703-a33f-4375-a956-154dfec4971e'}, 'image_ref': '36058703-a33f-4375-a956-154dfec4971e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ed7b1c54-ccf3-11f0-8954-fa163e5a5606', 'monotonic_time': 6295.82104753, 'message_signature': '717d84a42cf9695c0a938bafb6fdf51068758219eb514e4ffee476de471057b0'}]}, 'timestamp': '2025-11-29 07:20:48.187403', '_unique_id': '730a997892bf42ca931b9e803693a773'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:20:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:20:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:20:48 np0005539505 nova_compute[186958]: 2025-11-29 07:20:48.246 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:48 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:48Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:c4:96 10.100.0.5
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.018 186962 DEBUG nova.compute.manager [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.018 186962 DEBUG oslo_concurrency.lockutils [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.019 186962 DEBUG oslo_concurrency.lockutils [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.019 186962 DEBUG oslo_concurrency.lockutils [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.019 186962 DEBUG nova.compute.manager [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.020 186962 WARNING nova.compute.manager [req-94b83d4e-a5ef-4eef-9d91-c4b4c807cd56 req-a734e682-5e0b-4d23-a127-319c9d70ba5a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:20:50 np0005539505 nova_compute[186958]: 2025-11-29 07:20:50.063 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:50 np0005539505 podman[235906]: 2025-11-29 07:20:50.721057525 +0000 UTC m=+0.051277076 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:20:50 np0005539505 podman[235907]: 2025-11-29 07:20:50.749232194 +0000 UTC m=+0.079322861 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.249 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.850 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.851 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.873 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.988 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.989 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.996 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:20:53 np0005539505 nova_compute[186958]: 2025-11-29 07:20:53.997 186962 INFO nova.compute.claims [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:20:55 np0005539505 nova_compute[186958]: 2025-11-29 07:20:55.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:55 np0005539505 podman[235955]: 2025-11-29 07:20:55.720883725 +0000 UTC m=+0.052716937 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 29 02:20:55 np0005539505 podman[235956]: 2025-11-29 07:20:55.729680614 +0000 UTC m=+0.056820333 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:20:55 np0005539505 nova_compute[186958]: 2025-11-29 07:20:55.805 186962 DEBUG nova.compute.provider_tree [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:56 np0005539505 nova_compute[186958]: 2025-11-29 07:20:56.074 186962 DEBUG nova.scheduler.client.report [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:56 np0005539505 nova_compute[186958]: 2025-11-29 07:20:56.276 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:56 np0005539505 nova_compute[186958]: 2025-11-29 07:20:56.276 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.043 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.044 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.079 186962 INFO nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.113 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.242 186962 DEBUG nova.policy [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae34cae3b83748ca86590adaf3f6dab6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f098dabf54514ea688ed89906cf2d3dc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.255 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.256 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.257 186962 INFO nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Creating image(s)#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.258 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.258 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.259 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.271 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.327 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.328 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.329 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.340 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.400 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.402 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.433 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.434 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.434 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.489 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.490 186962 DEBUG nova.virt.disk.api [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Checking if we can resize image /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.491 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.556 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.557 186962 DEBUG nova.virt.disk.api [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Cannot resize image /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.557 186962 DEBUG nova.objects.instance [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lazy-loading 'migration_context' on Instance uuid dcc91175-ce19-46f4-b2c3-9a47065c5a3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.573 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.573 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Ensure instance console log exists: /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.574 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.574 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:57 np0005539505 nova_compute[186958]: 2025-11-29 07:20:57.574 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.021 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Successfully created port: e11f1d32-b7a1-4573-bf0c-82d8ef318172 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.250 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.418 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.419 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.419 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.419 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.420 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.431 186962 INFO nova.compute.manager [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Terminating instance#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.441 186962 DEBUG nova.compute.manager [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:20:58 np0005539505 kernel: tapc373f1d7-16 (unregistering): left promiscuous mode
Nov 29 02:20:58 np0005539505 NetworkManager[55134]: <info>  [1764400858.4620] device (tapc373f1d7-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:58Z|00512|binding|INFO|Releasing lport c373f1d7-168e-494b-8e6f-c8af44b0db68 from this chassis (sb_readonly=0)
Nov 29 02:20:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:58Z|00513|binding|INFO|Setting lport c373f1d7-168e-494b-8e6f-c8af44b0db68 down in Southbound
Nov 29 02:20:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:20:58Z|00514|binding|INFO|Removing iface tapc373f1d7-16 ovn-installed in OVS
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.483 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:c4:96 10.100.0.5'], port_security=['fa:16:3e:1b:c4:96 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'aa4795d1-71b1-415f-ac22-5bb11775bc84', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-14d61e69-b152-4adc-a95c-58748969e299', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '329bbbdd41424742b3045e77150a498e', 'neutron:revision_number': '11', 'neutron:security_group_ids': '24db58f8-235a-4b76-869f-efe13404b22a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=61c05e4b-7426-41e7-9cd6-8f37a87e832e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=c373f1d7-168e-494b-8e6f-c8af44b0db68) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.485 104094 INFO neutron.agent.ovn.metadata.agent [-] Port c373f1d7-168e-494b-8e6f-c8af44b0db68 in datapath 14d61e69-b152-4adc-a95c-58748969e299 unbound from our chassis#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.487 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 14d61e69-b152-4adc-a95c-58748969e299, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.489 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee97293-8ee3-497e-92df-28917550985a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.490 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 namespace which is not needed anymore#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.496 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000068.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539505 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d00000068.scope: Consumed 2.463s CPU time.
Nov 29 02:20:58 np0005539505 systemd-machined[153285]: Machine qemu-58-instance-00000068 terminated.
Nov 29 02:20:58 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [NOTICE]   (235892) : haproxy version is 2.8.14-c23fe91
Nov 29 02:20:58 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [NOTICE]   (235892) : path to executable is /usr/sbin/haproxy
Nov 29 02:20:58 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [WARNING]  (235892) : Exiting Master process...
Nov 29 02:20:58 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [ALERT]    (235892) : Current worker (235894) exited with code 143 (Terminated)
Nov 29 02:20:58 np0005539505 neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299[235888]: [WARNING]  (235892) : All workers exited. Exiting... (0)
Nov 29 02:20:58 np0005539505 systemd[1]: libpod-882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539505 podman[236034]: 2025-11-29 07:20:58.609758802 +0000 UTC m=+0.041583061 container died 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:20:58 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37-userdata-shm.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539505 systemd[1]: var-lib-containers-storage-overlay-e4868fce2b7197029fa5e7ce79af80c15274fb6c5fe420fd687347237b766e86-merged.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539505 podman[236034]: 2025-11-29 07:20:58.648685356 +0000 UTC m=+0.080509605 container cleanup 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:20:58 np0005539505 systemd[1]: libpod-conmon-882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37.scope: Deactivated successfully.
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.702 186962 INFO nova.virt.libvirt.driver [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Instance destroyed successfully.#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.703 186962 DEBUG nova.objects.instance [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lazy-loading 'resources' on Instance uuid aa4795d1-71b1-415f-ac22-5bb11775bc84 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.719 186962 DEBUG nova.virt.libvirt.vif [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1506153238',display_name='tempest-ServersNegativeTestJSON-server-1506153238',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1506153238',id=104,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:20:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='329bbbdd41424742b3045e77150a498e',ramdisk_id='',reservation_id='r-widkbdap',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-1191192320',owner_user_name='tempest-ServersNegativeTestJSON-1191192320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:20:47Z,user_data=None,user_id='2647a3e4fc214b4a85db1283eb7ef117',uuid=aa4795d1-71b1-415f-ac22-5bb11775bc84,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:20:58 np0005539505 podman[236065]: 2025-11-29 07:20:58.720127933 +0000 UTC m=+0.049291019 container remove 882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.720 186962 DEBUG nova.network.os_vif_util [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converting VIF {"id": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "address": "fa:16:3e:1b:c4:96", "network": {"id": "14d61e69-b152-4adc-a95c-58748969e299", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-2003556983-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "329bbbdd41424742b3045e77150a498e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc373f1d7-16", "ovs_interfaceid": "c373f1d7-168e-494b-8e6f-c8af44b0db68", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.721 186962 DEBUG nova.network.os_vif_util [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.721 186962 DEBUG os_vif [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.723 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.723 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc373f1d7-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.725 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.727 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.728 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e1df6632-28f6-4fbd-aff9-ef88dfb525c2]: (4, ('Sat Nov 29 07:20:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37)\n882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37\nSat Nov 29 07:20:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 (882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37)\n882b894175ed0786f2e24e5d85c12cf0a044c6c2aea00b918a48b00172f2ff37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.731 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6d7d4cce-9ade-4f1b-b34c-1a2f13d22de2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.732 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14d61e69-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:58 np0005539505 kernel: tap14d61e69-b0: left promiscuous mode
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.735 186962 INFO os_vif [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:c4:96,bridge_name='br-int',has_traffic_filtering=True,id=c373f1d7-168e-494b-8e6f-c8af44b0db68,network=Network(14d61e69-b152-4adc-a95c-58748969e299),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc373f1d7-16')#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.736 186962 INFO nova.virt.libvirt.driver [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Deleting instance files /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84_del#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.741 186962 INFO nova.virt.libvirt.driver [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Deletion of /var/lib/nova/instances/aa4795d1-71b1-415f-ac22-5bb11775bc84_del complete#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.744 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.750 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c13d4d37-b340-4a8c-b51f-99017bdd78d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.778 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7e41b135-63ae-457b-b5c2-7367f95ee582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.780 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7f38249e-4a63-4df6-a3ec-0cad06e55560]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.799 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1251cdae-2ad1-4c0f-8ad5-da8fc1216646]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 629395, 'reachable_time': 27157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236095, 'error': None, 'target': 'ovnmeta-14d61e69-b152-4adc-a95c-58748969e299', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.802 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-14d61e69-b152-4adc-a95c-58748969e299 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:20:58 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:20:58.802 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a74b29-45f9-4d56-8275-e9ea9907e8ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:20:58 np0005539505 systemd[1]: run-netns-ovnmeta\x2d14d61e69\x2db152\x2d4adc\x2da95c\x2d58748969e299.mount: Deactivated successfully.
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.818 186962 INFO nova.compute.manager [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.819 186962 DEBUG oslo.service.loopingcall [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.819 186962 DEBUG nova.compute.manager [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.819 186962 DEBUG nova.network.neutron [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.872 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Successfully updated port: e11f1d32-b7a1-4573-bf0c-82d8ef318172 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.895 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.895 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquired lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:58 np0005539505 nova_compute[186958]: 2025-11-29 07:20:58.895 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.010 186962 DEBUG nova.compute.manager [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-changed-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.010 186962 DEBUG nova.compute.manager [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Refreshing instance network info cache due to event network-changed-e11f1d32-b7a1-4573-bf0c-82d8ef318172. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.011 186962 DEBUG oslo_concurrency.lockutils [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.044 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.506 186962 DEBUG nova.network.neutron [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.528 186962 INFO nova.compute.manager [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Took 0.71 seconds to deallocate network for instance.#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.616 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.617 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.684 186962 DEBUG nova.compute.provider_tree [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.722 186962 DEBUG nova.scheduler.client.report [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.749 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.753 186962 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.753 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 WARNING nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-unplugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.754 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.755 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.755 186962 DEBUG oslo_concurrency.lockutils [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.755 186962 DEBUG nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] No waiting events found dispatching network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.755 186962 WARNING nova.compute.manager [req-038c88ff-950c-4b9d-829b-9b5cd9423dc2 req-05b70f2d-a578-4ad3-be5f-ab44d7647c85 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received unexpected event network-vif-plugged-c373f1d7-168e-494b-8e6f-c8af44b0db68 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.782 186962 INFO nova.scheduler.client.report [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Deleted allocations for instance aa4795d1-71b1-415f-ac22-5bb11775bc84#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.830 186962 DEBUG nova.network.neutron [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updating instance_info_cache with network_info: [{"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.853 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Releasing lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.853 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Instance network_info: |[{"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.853 186962 DEBUG oslo_concurrency.lockutils [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.854 186962 DEBUG nova.network.neutron [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Refreshing network info cache for port e11f1d32-b7a1-4573-bf0c-82d8ef318172 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.857 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Start _get_guest_xml network_info=[{"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.864 186962 WARNING nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.866 186962 DEBUG oslo_concurrency.lockutils [None req-a046c98e-5cb3-4189-a691-5c1e8ff91b0b 2647a3e4fc214b4a85db1283eb7ef117 329bbbdd41424742b3045e77150a498e - - default default] Lock "aa4795d1-71b1-415f-ac22-5bb11775bc84" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.447s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.872 186962 DEBUG nova.virt.libvirt.host [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.872 186962 DEBUG nova.virt.libvirt.host [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.877 186962 DEBUG nova.virt.libvirt.host [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.878 186962 DEBUG nova.virt.libvirt.host [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.879 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.879 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.879 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.880 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.880 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.880 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.880 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.880 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.881 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.881 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.881 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.881 186962 DEBUG nova.virt.hardware [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.884 186962 DEBUG nova.virt.libvirt.vif [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-26960618',display_name='tempest-ServersTestJSON-server-26960618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-26960618',id=113,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFAkwrnwJIV2dHUeR6GiloLZ5fNRFscp4Nl3oMm+lk6fhQyprdJ6/R2wCe9xPzJASnddH6OtStPxb6jWU1aHmUG9N7bOxgbcEM0BlM7roptY13Spiu5wX1xCZ2XmCdEdrw==',key_name='tempest-keypair-392868756',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f098dabf54514ea688ed89906cf2d3dc',ramdisk_id='',reservation_id='r-dgnd33do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-598601190',owner_user_name='tempest-ServersTestJSON-598601190-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae34cae3b83748ca86590adaf3f6dab6',uuid=dcc91175-ce19-46f4-b2c3-9a47065c5a3b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.885 186962 DEBUG nova.network.os_vif_util [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converting VIF {"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.885 186962 DEBUG nova.network.os_vif_util [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.887 186962 DEBUG nova.objects.instance [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lazy-loading 'pci_devices' on Instance uuid dcc91175-ce19-46f4-b2c3-9a47065c5a3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.903 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <uuid>dcc91175-ce19-46f4-b2c3-9a47065c5a3b</uuid>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <name>instance-00000071</name>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServersTestJSON-server-26960618</nova:name>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:20:59</nova:creationTime>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:user uuid="ae34cae3b83748ca86590adaf3f6dab6">tempest-ServersTestJSON-598601190-project-member</nova:user>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:project uuid="f098dabf54514ea688ed89906cf2d3dc">tempest-ServersTestJSON-598601190</nova:project>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        <nova:port uuid="e11f1d32-b7a1-4573-bf0c-82d8ef318172">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="serial">dcc91175-ce19-46f4-b2c3-9a47065c5a3b</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="uuid">dcc91175-ce19-46f4-b2c3-9a47065c5a3b</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.config"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:c8:8c:dd"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <target dev="tape11f1d32-b7"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/console.log" append="off"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:20:59 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:20:59 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:20:59 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:20:59 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.904 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Preparing to wait for external event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.904 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.905 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.905 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.906 186962 DEBUG nova.virt.libvirt.vif [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:20:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-26960618',display_name='tempest-ServersTestJSON-server-26960618',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-26960618',id=113,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFAkwrnwJIV2dHUeR6GiloLZ5fNRFscp4Nl3oMm+lk6fhQyprdJ6/R2wCe9xPzJASnddH6OtStPxb6jWU1aHmUG9N7bOxgbcEM0BlM7roptY13Spiu5wX1xCZ2XmCdEdrw==',key_name='tempest-keypair-392868756',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f098dabf54514ea688ed89906cf2d3dc',ramdisk_id='',reservation_id='r-dgnd33do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-598601190',owner_user_name='tempest-ServersTestJSON-598601190-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:20:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae34cae3b83748ca86590adaf3f6dab6',uuid=dcc91175-ce19-46f4-b2c3-9a47065c5a3b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.906 186962 DEBUG nova.network.os_vif_util [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converting VIF {"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.907 186962 DEBUG nova.network.os_vif_util [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.907 186962 DEBUG os_vif [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.907 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.908 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.908 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.912 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape11f1d32-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.913 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape11f1d32-b7, col_values=(('external_ids', {'iface-id': 'e11f1d32-b7a1-4573-bf0c-82d8ef318172', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:8c:dd', 'vm-uuid': 'dcc91175-ce19-46f4-b2c3-9a47065c5a3b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.914 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:59 np0005539505 NetworkManager[55134]: <info>  [1764400859.9157] manager: (tape11f1d32-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.920 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.920 186962 INFO os_vif [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7')#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.984 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.984 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.984 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] No VIF found with MAC fa:16:3e:c8:8c:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:20:59 np0005539505 nova_compute[186958]: 2025-11-29 07:20:59.985 186962 INFO nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Using config drive#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.066 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.351 186962 INFO nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Creating config drive at /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.config#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.358 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_trr1zf_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.493 186962 DEBUG oslo_concurrency.processutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_trr1zf_" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:00 np0005539505 kernel: tape11f1d32-b7: entered promiscuous mode
Nov 29 02:21:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:00Z|00515|binding|INFO|Claiming lport e11f1d32-b7a1-4573-bf0c-82d8ef318172 for this chassis.
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.5531] manager: (tape11f1d32-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/253)
Nov 29 02:21:00 np0005539505 systemd-udevd[236014]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.553 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:00Z|00516|binding|INFO|e11f1d32-b7a1-4573-bf0c-82d8ef318172: Claiming fa:16:3e:c8:8c:dd 10.100.0.7
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.557 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.5667] device (tape11f1d32-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.565 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:8c:dd 10.100.0.7'], port_security=['fa:16:3e:c8:8c:dd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dcc91175-ce19-46f4-b2c3-9a47065c5a3b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f098dabf54514ea688ed89906cf2d3dc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '57732679-d152-43b8-8f82-648ae514f6e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b347a519-7cb9-4fa1-b41f-bde19795a156, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e11f1d32-b7a1-4573-bf0c-82d8ef318172) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.5676] device (tape11f1d32-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.567 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e11f1d32-b7a1-4573-bf0c-82d8ef318172 in datapath fb9b6d5b-9325-488c-9874-5dad63b487ef bound to our chassis#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.569 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fb9b6d5b-9325-488c-9874-5dad63b487ef#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[60789cc7-74b1-4d7f-9e74-bdcbccb83334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.580 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfb9b6d5b-91 in ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.582 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfb9b6d5b-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.582 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[94dc4fee-7a01-4cc0-96be-9f1a221fff14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.583 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8d856399-a0d8-47c0-9b03-d1a521f51af2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.594 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[13ac22e5-3452-4a4e-bb7d-c3b5f96a6c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 systemd-machined[153285]: New machine qemu-59-instance-00000071.
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.613 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[17e2576f-5239-419c-95cd-6dfcfa118610]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 systemd[1]: Started Virtual Machine qemu-59-instance-00000071.
Nov 29 02:21:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:00Z|00517|binding|INFO|Setting lport e11f1d32-b7a1-4573-bf0c-82d8ef318172 up in Southbound
Nov 29 02:21:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:00Z|00518|binding|INFO|Setting lport e11f1d32-b7a1-4573-bf0c-82d8ef318172 ovn-installed in OVS
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.622 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.623 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.648 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8259d67-587f-4ab8-9e5d-ca0c3b30a511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.652 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b2485d58-0f74-42d3-b280-243214bf3111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.6537] manager: (tapfb9b6d5b-90): new Veth device (/org/freedesktop/NetworkManager/Devices/254)
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.689 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c6fce3b5-a436-49d8-84d9-e6587cac8d1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.692 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2df92066-6b61-4b58-a506-9026457a2cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.7142] device (tapfb9b6d5b-90): carrier: link connected
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.721 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dff941ca-f2c3-40f0-ae6d-6cba8aa135ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fb989cf8-3526-4dbf-b40b-2fe8f1806b2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb9b6d5b-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:73:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630829, 'reachable_time': 28791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236147, 'error': None, 'target': 'ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.753 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66fc9061-7860-4803-877e-dec3d06f04c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2e:73d2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630829, 'tstamp': 630829}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236148, 'error': None, 'target': 'ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.769 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2d4f6c-9db8-4d85-a713-de338be9f455]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfb9b6d5b-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2e:73:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630829, 'reachable_time': 28791, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236149, 'error': None, 'target': 'ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.806 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e65681ab-ebcd-4f70-940b-94e7ba15969c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a96c49-a107-4d0e-ad2a-96383c3f7128]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.865 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb9b6d5b-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.866 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.866 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb9b6d5b-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:00 np0005539505 NetworkManager[55134]: <info>  [1764400860.8693] manager: (tapfb9b6d5b-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 kernel: tapfb9b6d5b-90: entered promiscuous mode
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.871 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfb9b6d5b-90, col_values=(('external_ids', {'iface-id': '412684c6-eee2-4a1c-8ae7-fabc6e42bc67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:00Z|00519|binding|INFO|Releasing lport 412684c6-eee2-4a1c-8ae7-fabc6e42bc67 from this chassis (sb_readonly=0)
Nov 29 02:21:00 np0005539505 nova_compute[186958]: 2025-11-29 07:21:00.884 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.885 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fb9b6d5b-9325-488c-9874-5dad63b487ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fb9b6d5b-9325-488c-9874-5dad63b487ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.886 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d504b2-d891-4b68-84da-9e6c406ff2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.887 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-fb9b6d5b-9325-488c-9874-5dad63b487ef
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/fb9b6d5b-9325-488c-9874-5dad63b487ef.pid.haproxy
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID fb9b6d5b-9325-488c-9874-5dad63b487ef
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:21:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:00.889 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'env', 'PROCESS_TAG=haproxy-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fb9b6d5b-9325-488c-9874-5dad63b487ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.186 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400861.1855075, dcc91175-ce19-46f4-b2c3-9a47065c5a3b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.186 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.206 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.211 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400861.1857576, dcc91175-ce19-46f4-b2c3-9a47065c5a3b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.211 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.232 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.238 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.255 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.305 186962 DEBUG nova.compute.manager [req-60f1b596-1d6f-48a1-a3f7-9a373e8ed6a0 req-e0ee4294-ba78-4630-92e6-e50dc1948fe5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Received event network-vif-deleted-c373f1d7-168e-494b-8e6f-c8af44b0db68 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:01 np0005539505 podman[236187]: 2025-11-29 07:21:01.308482733 +0000 UTC m=+0.061121775 container create 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:21:01 np0005539505 systemd[1]: Started libpod-conmon-602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a.scope.
Nov 29 02:21:01 np0005539505 podman[236187]: 2025-11-29 07:21:01.277820143 +0000 UTC m=+0.030459205 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:21:01 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:21:01 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1dc5a478dd514389e76eb963ccf8b630147140e4e7b60ba8e04d219ef7c6d3f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:21:01 np0005539505 podman[236187]: 2025-11-29 07:21:01.400635708 +0000 UTC m=+0.153274750 container init 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:21:01 np0005539505 podman[236187]: 2025-11-29 07:21:01.407274286 +0000 UTC m=+0.159913328 container start 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:21:01 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [NOTICE]   (236206) : New worker (236208) forked
Nov 29 02:21:01 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [NOTICE]   (236206) : Loading success.
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.611 186962 DEBUG nova.network.neutron [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updated VIF entry in instance network info cache for port e11f1d32-b7a1-4573-bf0c-82d8ef318172. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.612 186962 DEBUG nova.network.neutron [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updating instance_info_cache with network_info: [{"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.629 186962 DEBUG oslo_concurrency.lockutils [req-2eab5e6a-1bc0-4267-8da9-ba8d89af58e1 req-bdc48fa2-d634-425c-b269-a889414ab41f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.881 186962 DEBUG nova.compute.manager [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.882 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.882 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.883 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.883 186962 DEBUG nova.compute.manager [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Processing event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.883 186962 DEBUG nova.compute.manager [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.883 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.884 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.884 186962 DEBUG oslo_concurrency.lockutils [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.884 186962 DEBUG nova.compute.manager [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] No waiting events found dispatching network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.885 186962 WARNING nova.compute.manager [req-533a60f6-37fe-4703-ada0-6247919e75cf req-3ac10f86-9a32-4682-8962-e4086f8a480e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received unexpected event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.885 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.889 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400861.8896866, dcc91175-ce19-46f4-b2c3-9a47065c5a3b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.890 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.892 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.895 186962 INFO nova.virt.libvirt.driver [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Instance spawned successfully.#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.896 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.914 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.917 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.925 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.926 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.926 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.927 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.927 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.928 186962 DEBUG nova.virt.libvirt.driver [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:21:01 np0005539505 nova_compute[186958]: 2025-11-29 07:21:01.939 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:21:02 np0005539505 nova_compute[186958]: 2025-11-29 07:21:02.020 186962 INFO nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Took 4.76 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:21:02 np0005539505 nova_compute[186958]: 2025-11-29 07:21:02.021 186962 DEBUG nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:02 np0005539505 nova_compute[186958]: 2025-11-29 07:21:02.108 186962 INFO nova.compute.manager [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Took 8.16 seconds to build instance.#033[00m
Nov 29 02:21:02 np0005539505 nova_compute[186958]: 2025-11-29 07:21:02.128 186962 DEBUG oslo_concurrency.lockutils [None req-fdb3c789-59cf-49ca-8aad-8f3aed350d8b ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:02Z|00520|binding|INFO|Releasing lport 412684c6-eee2-4a1c-8ae7-fabc6e42bc67 from this chassis (sb_readonly=0)
Nov 29 02:21:02 np0005539505 nova_compute[186958]: 2025-11-29 07:21:02.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:04 np0005539505 NetworkManager[55134]: <info>  [1764400864.2078] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/256)
Nov 29 02:21:04 np0005539505 NetworkManager[55134]: <info>  [1764400864.2088] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.207 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.317 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:04Z|00521|binding|INFO|Releasing lport 412684c6-eee2-4a1c-8ae7-fabc6e42bc67 from this chassis (sb_readonly=0)
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.332 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.647 186962 DEBUG nova.compute.manager [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-changed-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.647 186962 DEBUG nova.compute.manager [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Refreshing instance network info cache due to event network-changed-e11f1d32-b7a1-4573-bf0c-82d8ef318172. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.648 186962 DEBUG oslo_concurrency.lockutils [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.648 186962 DEBUG oslo_concurrency.lockutils [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.648 186962 DEBUG nova.network.neutron [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Refreshing network info cache for port e11f1d32-b7a1-4573-bf0c-82d8ef318172 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:21:04 np0005539505 nova_compute[186958]: 2025-11-29 07:21:04.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:05 np0005539505 nova_compute[186958]: 2025-11-29 07:21:05.069 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:05 np0005539505 nova_compute[186958]: 2025-11-29 07:21:05.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:06 np0005539505 nova_compute[186958]: 2025-11-29 07:21:06.839 186962 DEBUG nova.network.neutron [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updated VIF entry in instance network info cache for port e11f1d32-b7a1-4573-bf0c-82d8ef318172. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:21:06 np0005539505 nova_compute[186958]: 2025-11-29 07:21:06.839 186962 DEBUG nova.network.neutron [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updating instance_info_cache with network_info: [{"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:21:06 np0005539505 nova_compute[186958]: 2025-11-29 07:21:06.877 186962 DEBUG oslo_concurrency.lockutils [req-1d614552-9a82-4fff-890f-d5fedb4d0bf9 req-8ed2a68f-5f47-4c8f-86a0-2764e6876917 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-dcc91175-ce19-46f4-b2c3-9a47065c5a3b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:21:09 np0005539505 nova_compute[186958]: 2025-11-29 07:21:09.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:09 np0005539505 podman[236219]: 2025-11-29 07:21:09.733924619 +0000 UTC m=+0.053671654 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:21:09 np0005539505 podman[236218]: 2025-11-29 07:21:09.744383545 +0000 UTC m=+0.068438532 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:21:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:09Z|00522|binding|INFO|Releasing lport 412684c6-eee2-4a1c-8ae7-fabc6e42bc67 from this chassis (sb_readonly=0)
Nov 29 02:21:09 np0005539505 nova_compute[186958]: 2025-11-29 07:21:09.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:09 np0005539505 nova_compute[186958]: 2025-11-29 07:21:09.917 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:10 np0005539505 nova_compute[186958]: 2025-11-29 07:21:10.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:10 np0005539505 nova_compute[186958]: 2025-11-29 07:21:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:10.570 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:10.572 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:21:10 np0005539505 nova_compute[186958]: 2025-11-29 07:21:10.572 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:11 np0005539505 nova_compute[186958]: 2025-11-29 07:21:11.924 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:12 np0005539505 podman[236261]: 2025-11-29 07:21:12.738488517 +0000 UTC m=+0.064709387 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:21:13 np0005539505 nova_compute[186958]: 2025-11-29 07:21:13.701 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400858.6994562, aa4795d1-71b1-415f-ac22-5bb11775bc84 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:13 np0005539505 nova_compute[186958]: 2025-11-29 07:21:13.702 186962 INFO nova.compute.manager [-] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:21:13 np0005539505 nova_compute[186958]: 2025-11-29 07:21:13.729 186962 DEBUG nova.compute.manager [None req-7d24219f-9e60-4a79-9500-48a42c1e3b54 - - - - - -] [instance: aa4795d1-71b1-415f-ac22-5bb11775bc84] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:14 np0005539505 nova_compute[186958]: 2025-11-29 07:21:14.919 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:15 np0005539505 nova_compute[186958]: 2025-11-29 07:21:15.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:15.574 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:15Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:8c:dd 10.100.0.7
Nov 29 02:21:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:15Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:8c:dd 10.100.0.7
Nov 29 02:21:16 np0005539505 nova_compute[186958]: 2025-11-29 07:21:16.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:16 np0005539505 nova_compute[186958]: 2025-11-29 07:21:16.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:16 np0005539505 nova_compute[186958]: 2025-11-29 07:21:16.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:21:17 np0005539505 nova_compute[186958]: 2025-11-29 07:21:17.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:18 np0005539505 nova_compute[186958]: 2025-11-29 07:21:18.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.365 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.366 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.366 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.366 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.509 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.577 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.578 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.643 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.825 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.827 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5477MB free_disk=73.04522323608398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.827 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.828 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:19 np0005539505 nova_compute[186958]: 2025-11-29 07:21:19.920 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.031 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance dcc91175-ce19-46f4-b2c3-9a47065c5a3b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.032 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.033 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.075 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.198 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.216 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.257 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:21:20 np0005539505 nova_compute[186958]: 2025-11-29 07:21:20.257 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:21 np0005539505 podman[236310]: 2025-11-29 07:21:21.748349674 +0000 UTC m=+0.074104503 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:21:21 np0005539505 podman[236311]: 2025-11-29 07:21:21.776234625 +0000 UTC m=+0.103321432 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:21:22 np0005539505 nova_compute[186958]: 2025-11-29 07:21:22.258 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:22 np0005539505 nova_compute[186958]: 2025-11-29 07:21:22.258 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:21:22 np0005539505 nova_compute[186958]: 2025-11-29 07:21:22.559 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:21:24 np0005539505 nova_compute[186958]: 2025-11-29 07:21:24.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:25 np0005539505 nova_compute[186958]: 2025-11-29 07:21:25.077 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:25 np0005539505 nova_compute[186958]: 2025-11-29 07:21:25.123 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:26 np0005539505 podman[236360]: 2025-11-29 07:21:26.722326592 +0000 UTC m=+0.057881394 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 02:21:26 np0005539505 podman[236361]: 2025-11-29 07:21:26.730789242 +0000 UTC m=+0.057630126 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.751 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.752 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.752 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.752 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.752 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:26 np0005539505 nova_compute[186958]: 2025-11-29 07:21:26.927 186962 INFO nova.compute.manager [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Terminating instance#033[00m
Nov 29 02:21:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:26.960 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:26.961 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:26.962 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.027 186962 DEBUG nova.compute.manager [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:21:27 np0005539505 kernel: tape11f1d32-b7 (unregistering): left promiscuous mode
Nov 29 02:21:27 np0005539505 NetworkManager[55134]: <info>  [1764400887.0544] device (tape11f1d32-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:21:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:27Z|00523|binding|INFO|Releasing lport e11f1d32-b7a1-4573-bf0c-82d8ef318172 from this chassis (sb_readonly=0)
Nov 29 02:21:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:27Z|00524|binding|INFO|Setting lport e11f1d32-b7a1-4573-bf0c-82d8ef318172 down in Southbound
Nov 29 02:21:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:21:27Z|00525|binding|INFO|Removing iface tape11f1d32-b7 ovn-installed in OVS
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.078 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.090 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:27.095 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:8c:dd 10.100.0.7'], port_security=['fa:16:3e:c8:8c:dd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dcc91175-ce19-46f4-b2c3-9a47065c5a3b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f098dabf54514ea688ed89906cf2d3dc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '57732679-d152-43b8-8f82-648ae514f6e5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b347a519-7cb9-4fa1-b41f-bde19795a156, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=e11f1d32-b7a1-4573-bf0c-82d8ef318172) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:27.096 104094 INFO neutron.agent.ovn.metadata.agent [-] Port e11f1d32-b7a1-4573-bf0c-82d8ef318172 in datapath fb9b6d5b-9325-488c-9874-5dad63b487ef unbound from our chassis#033[00m
Nov 29 02:21:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:27.098 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb9b6d5b-9325-488c-9874-5dad63b487ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:21:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:27.100 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3e27f3-f7e5-4471-9086-e563c7942f7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:27.101 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef namespace which is not needed anymore#033[00m
Nov 29 02:21:27 np0005539505 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000071.scope: Deactivated successfully.
Nov 29 02:21:27 np0005539505 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d00000071.scope: Consumed 14.293s CPU time.
Nov 29 02:21:27 np0005539505 systemd-machined[153285]: Machine qemu-59-instance-00000071 terminated.
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.303 186962 INFO nova.virt.libvirt.driver [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Instance destroyed successfully.#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.304 186962 DEBUG nova.objects.instance [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lazy-loading 'resources' on Instance uuid dcc91175-ce19-46f4-b2c3-9a47065c5a3b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.396 186962 DEBUG nova.virt.libvirt.vif [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:20:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-26960618',display_name='tempest-ServersTestJSON-server-26960618',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestjson-server-26960618',id=113,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFAkwrnwJIV2dHUeR6GiloLZ5fNRFscp4Nl3oMm+lk6fhQyprdJ6/R2wCe9xPzJASnddH6OtStPxb6jWU1aHmUG9N7bOxgbcEM0BlM7roptY13Spiu5wX1xCZ2XmCdEdrw==',key_name='tempest-keypair-392868756',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:21:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f098dabf54514ea688ed89906cf2d3dc',ramdisk_id='',reservation_id='r-dgnd33do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-598601190',owner_user_name='tempest-ServersTestJSON-598601190-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:21:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae34cae3b83748ca86590adaf3f6dab6',uuid=dcc91175-ce19-46f4-b2c3-9a47065c5a3b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.396 186962 DEBUG nova.network.os_vif_util [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converting VIF {"id": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "address": "fa:16:3e:c8:8c:dd", "network": {"id": "fb9b6d5b-9325-488c-9874-5dad63b487ef", "bridge": "br-int", "label": "tempest-ServersTestJSON-1033615045-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f098dabf54514ea688ed89906cf2d3dc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape11f1d32-b7", "ovs_interfaceid": "e11f1d32-b7a1-4573-bf0c-82d8ef318172", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.397 186962 DEBUG nova.network.os_vif_util [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.397 186962 DEBUG os_vif [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.399 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.400 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape11f1d32-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.401 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.404 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.408 186962 INFO os_vif [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:8c:dd,bridge_name='br-int',has_traffic_filtering=True,id=e11f1d32-b7a1-4573-bf0c-82d8ef318172,network=Network(fb9b6d5b-9325-488c-9874-5dad63b487ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape11f1d32-b7')#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.409 186962 INFO nova.virt.libvirt.driver [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Deleting instance files /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b_del#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.410 186962 INFO nova.virt.libvirt.driver [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Deletion of /var/lib/nova/instances/dcc91175-ce19-46f4-b2c3-9a47065c5a3b_del complete#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [NOTICE]   (236206) : haproxy version is 2.8.14-c23fe91
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [NOTICE]   (236206) : path to executable is /usr/sbin/haproxy
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [WARNING]  (236206) : Exiting Master process...
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [WARNING]  (236206) : Exiting Master process...
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [ALERT]    (236206) : Current worker (236208) exited with code 143 (Terminated)
Nov 29 02:21:27 np0005539505 neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef[236202]: [WARNING]  (236206) : All workers exited. Exiting... (0)
Nov 29 02:21:27 np0005539505 systemd[1]: libpod-602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a.scope: Deactivated successfully.
Nov 29 02:21:27 np0005539505 podman[236427]: 2025-11-29 07:21:27.642561102 +0000 UTC m=+0.449954698 container died 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:21:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a-userdata-shm.mount: Deactivated successfully.
Nov 29 02:21:27 np0005539505 systemd[1]: var-lib-containers-storage-overlay-1dc5a478dd514389e76eb963ccf8b630147140e4e7b60ba8e04d219ef7c6d3f9-merged.mount: Deactivated successfully.
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.837 186962 INFO nova.compute.manager [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.838 186962 DEBUG oslo.service.loopingcall [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.838 186962 DEBUG nova.compute.manager [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.838 186962 DEBUG nova.network.neutron [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.996 186962 DEBUG nova.compute.manager [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-unplugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.997 186962 DEBUG oslo_concurrency.lockutils [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.997 186962 DEBUG oslo_concurrency.lockutils [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.997 186962 DEBUG oslo_concurrency.lockutils [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.998 186962 DEBUG nova.compute.manager [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] No waiting events found dispatching network-vif-unplugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:21:27 np0005539505 nova_compute[186958]: 2025-11-29 07:21:27.998 186962 DEBUG nova.compute.manager [req-cc2ae69e-2c79-4d33-8088-4917d49481cd req-4a2d525b-4d00-45bf-b705-7aef0c109cc9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-unplugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:21:28 np0005539505 podman[236427]: 2025-11-29 07:21:28.257819178 +0000 UTC m=+1.065212784 container cleanup 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:21:28 np0005539505 systemd[1]: libpod-conmon-602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a.scope: Deactivated successfully.
Nov 29 02:21:28 np0005539505 nova_compute[186958]: 2025-11-29 07:21:28.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:29 np0005539505 podman[236475]: 2025-11-29 07:21:29.485035759 +0000 UTC m=+1.203925131 container remove 602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.493 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0ca365-d17e-4bbd-82b0-8d26afae95c5]: (4, ('Sat Nov 29 07:21:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef (602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a)\n602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a\nSat Nov 29 07:21:28 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef (602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a)\n602e4f6e79a023cac5303d7ab509aeccfc95939159f962e6fed0ee255946ed6a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.494 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5237b5de-d5d5-48d1-bdcc-18976ed5d8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.496 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb9b6d5b-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:21:29 np0005539505 nova_compute[186958]: 2025-11-29 07:21:29.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:29 np0005539505 kernel: tapfb9b6d5b-90: left promiscuous mode
Nov 29 02:21:29 np0005539505 nova_compute[186958]: 2025-11-29 07:21:29.513 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.517 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[20c5345d-e7af-41f7-9089-7790479c94ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.536 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a4846354-8035-4c38-9d13-2fb0812b4604]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[554cb4af-4b9e-4c5e-82a5-d17e30864f40]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.552 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c064bfb9-f358-4d2e-b165-ad521b5a72df]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630822, 'reachable_time': 40268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236491, 'error': None, 'target': 'ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.555 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fb9b6d5b-9325-488c-9874-5dad63b487ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:21:29 np0005539505 systemd[1]: run-netns-ovnmeta\x2dfb9b6d5b\x2d9325\x2d488c\x2d9874\x2d5dad63b487ef.mount: Deactivated successfully.
Nov 29 02:21:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:29.555 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[f99fe69e-6c61-4eee-a128-f8d0b847ab8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:29 np0005539505 nova_compute[186958]: 2025-11-29 07:21:29.789 186962 DEBUG nova.network.neutron [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:21:29 np0005539505 nova_compute[186958]: 2025-11-29 07:21:29.857 186962 INFO nova.compute.manager [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Took 2.02 seconds to deallocate network for instance.#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.017 186962 DEBUG nova.compute.manager [req-fa6848ce-92a2-4bde-9fed-94998aa380b2 req-92d8848f-bfd1-4ba8-bae8-e94b28706d7e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-deleted-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.078 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.366 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.367 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.389 186962 DEBUG nova.compute.manager [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.389 186962 DEBUG oslo_concurrency.lockutils [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.389 186962 DEBUG oslo_concurrency.lockutils [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.389 186962 DEBUG oslo_concurrency.lockutils [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.390 186962 DEBUG nova.compute.manager [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] No waiting events found dispatching network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.390 186962 WARNING nova.compute.manager [req-2f13e55c-a9b5-4556-9f27-7a76af48bc31 req-b749defc-0d37-4d47-9f87-e9f9adf4e27b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Received unexpected event network-vif-plugged-e11f1d32-b7a1-4573-bf0c-82d8ef318172 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.434 186962 DEBUG nova.compute.provider_tree [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.487 186962 DEBUG nova.scheduler.client.report [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.570 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.615 186962 INFO nova.scheduler.client.report [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Deleted allocations for instance dcc91175-ce19-46f4-b2c3-9a47065c5a3b#033[00m
Nov 29 02:21:30 np0005539505 nova_compute[186958]: 2025-11-29 07:21:30.821 186962 DEBUG oslo_concurrency.lockutils [None req-82fbb450-3b25-4e08-9e2f-9b9849db32b8 ae34cae3b83748ca86590adaf3f6dab6 f098dabf54514ea688ed89906cf2d3dc - - default default] Lock "dcc91175-ce19-46f4-b2c3-9a47065c5a3b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.379 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.380 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.380 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.380 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.381 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.381 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.402 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.517 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.642 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.642 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.643 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.643 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.643 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.643 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.644 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.644 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.644 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.645 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.645 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.645 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.646 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.646 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.646 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.646 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Nov 29 02:21:32 np0005539505 nova_compute[186958]: 2025-11-29 07:21:32.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:33 np0005539505 nova_compute[186958]: 2025-11-29 07:21:33.130 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:35 np0005539505 nova_compute[186958]: 2025-11-29 07:21:35.080 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:37 np0005539505 nova_compute[186958]: 2025-11-29 07:21:37.435 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:40 np0005539505 nova_compute[186958]: 2025-11-29 07:21:40.082 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:40 np0005539505 podman[236493]: 2025-11-29 07:21:40.726914555 +0000 UTC m=+0.055652900 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350)
Nov 29 02:21:40 np0005539505 podman[236494]: 2025-11-29 07:21:40.747013555 +0000 UTC m=+0.076461760 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:21:42 np0005539505 nova_compute[186958]: 2025-11-29 07:21:42.301 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400887.300715, dcc91175-ce19-46f4-b2c3-9a47065c5a3b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:21:42 np0005539505 nova_compute[186958]: 2025-11-29 07:21:42.302 186962 INFO nova.compute.manager [-] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:21:42 np0005539505 nova_compute[186958]: 2025-11-29 07:21:42.330 186962 DEBUG nova.compute.manager [None req-f0f0cdf2-c991-451f-a1d3-290e47c81821 - - - - - -] [instance: dcc91175-ce19-46f4-b2c3-9a47065c5a3b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:21:42 np0005539505 nova_compute[186958]: 2025-11-29 07:21:42.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:43 np0005539505 podman[236533]: 2025-11-29 07:21:43.706207477 +0000 UTC m=+0.042836946 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:21:45 np0005539505 nova_compute[186958]: 2025-11-29 07:21:45.085 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:45.873 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2 2001:db8::f816:3eff:fe36:9b10'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe36:9b10/64', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=944fc855-be48-4f5c-ba58-0898fe543a04, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6897d2ce-b04d-4d85-9bb6-9da51e7d7f20) old=Port_Binding(mac=['fa:16:3e:36:9b:10 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f75dc671-4e0c-40f1-8afd-c16b5e416d95', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:21:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:45.874 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6897d2ce-b04d-4d85-9bb6-9da51e7d7f20 in datapath f75dc671-4e0c-40f1-8afd-c16b5e416d95 updated#033[00m
Nov 29 02:21:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:45.875 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f75dc671-4e0c-40f1-8afd-c16b5e416d95, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:21:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:21:45.876 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[124a3737-773b-42dd-8eb7-6d4955fa5327]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:21:47 np0005539505 nova_compute[186958]: 2025-11-29 07:21:47.441 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:50 np0005539505 nova_compute[186958]: 2025-11-29 07:21:50.089 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539505 nova_compute[186958]: 2025-11-29 07:21:52.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:52 np0005539505 podman[236552]: 2025-11-29 07:21:52.712557366 +0000 UTC m=+0.049070653 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:21:52 np0005539505 podman[236553]: 2025-11-29 07:21:52.754743203 +0000 UTC m=+0.087194325 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 02:21:55 np0005539505 nova_compute[186958]: 2025-11-29 07:21:55.091 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:57 np0005539505 nova_compute[186958]: 2025-11-29 07:21:57.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:21:57 np0005539505 podman[236601]: 2025-11-29 07:21:57.722571855 +0000 UTC m=+0.058409178 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:21:57 np0005539505 podman[236602]: 2025-11-29 07:21:57.722131373 +0000 UTC m=+0.054825867 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 29 02:22:00 np0005539505 nova_compute[186958]: 2025-11-29 07:22:00.094 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:02 np0005539505 nova_compute[186958]: 2025-11-29 07:22:02.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:05 np0005539505 nova_compute[186958]: 2025-11-29 07:22:05.095 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:05 np0005539505 nova_compute[186958]: 2025-11-29 07:22:05.643 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:07 np0005539505 nova_compute[186958]: 2025-11-29 07:22:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:07 np0005539505 nova_compute[186958]: 2025-11-29 07:22:07.537 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:10 np0005539505 nova_compute[186958]: 2025-11-29 07:22:10.098 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:10 np0005539505 nova_compute[186958]: 2025-11-29 07:22:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:11 np0005539505 nova_compute[186958]: 2025-11-29 07:22:11.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:11 np0005539505 podman[236640]: 2025-11-29 07:22:11.720184369 +0000 UTC m=+0.051927154 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:22:11 np0005539505 podman[236641]: 2025-11-29 07:22:11.7237275 +0000 UTC m=+0.050261187 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:22:12 np0005539505 nova_compute[186958]: 2025-11-29 07:22:12.539 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:14 np0005539505 podman[236685]: 2025-11-29 07:22:14.722296739 +0000 UTC m=+0.052099870 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:22:15 np0005539505 nova_compute[186958]: 2025-11-29 07:22:15.099 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:15.284 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:15.285 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:22:15 np0005539505 nova_compute[186958]: 2025-11-29 07:22:15.286 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:16 np0005539505 nova_compute[186958]: 2025-11-29 07:22:16.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:16 np0005539505 nova_compute[186958]: 2025-11-29 07:22:16.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.544 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.667 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.667 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.709 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.963 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.963 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.970 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:22:17 np0005539505 nova_compute[186958]: 2025-11-29 07:22:17.971 186962 INFO nova.compute.claims [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.111 186962 DEBUG nova.compute.provider_tree [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.134 186962 DEBUG nova.scheduler.client.report [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.167 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.168 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.258 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.259 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.280 186962 INFO nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.298 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.372 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.431 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.432 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.433 186962 INFO nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating image(s)#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.433 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.434 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.434 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.447 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.487 186962 DEBUG nova.policy [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.506 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.507 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.507 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.518 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.579 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.580 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.755 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk 1073741824" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.756 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.757 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.814 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.816 186962 DEBUG nova.virt.disk.api [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.816 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.873 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.874 186962 DEBUG nova.virt.disk.api [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.874 186962 DEBUG nova.objects.instance [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.888 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.889 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Ensure instance console log exists: /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.890 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.890 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:18 np0005539505 nova_compute[186958]: 2025-11-29 07:22:18.890 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.425 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.426 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.426 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.426 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.580 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.582 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5694MB free_disk=73.0737533569336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.582 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.582 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.664 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance ef5226ee-eaff-4a79-bb36-b60389141ed0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.665 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.665 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.706 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.728 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.750 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:22:19 np0005539505 nova_compute[186958]: 2025-11-29 07:22:19.751 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:20 np0005539505 nova_compute[186958]: 2025-11-29 07:22:20.101 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:20 np0005539505 nova_compute[186958]: 2025-11-29 07:22:20.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:20 np0005539505 nova_compute[186958]: 2025-11-29 07:22:20.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:22:20 np0005539505 nova_compute[186958]: 2025-11-29 07:22:20.892 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:22:20 np0005539505 nova_compute[186958]: 2025-11-29 07:22:20.892 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:21 np0005539505 nova_compute[186958]: 2025-11-29 07:22:21.352 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Successfully created port: 0f6fb919-4a83-4171-91fa-fa47d25e2247 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:22:21 np0005539505 nova_compute[186958]: 2025-11-29 07:22:21.978 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:21 np0005539505 nova_compute[186958]: 2025-11-29 07:22:21.978 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:22:21 np0005539505 nova_compute[186958]: 2025-11-29 07:22:21.979 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:22:22 np0005539505 nova_compute[186958]: 2025-11-29 07:22:22.043 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:22:22 np0005539505 nova_compute[186958]: 2025-11-29 07:22:22.044 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:22:22 np0005539505 nova_compute[186958]: 2025-11-29 07:22:22.546 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.226 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Successfully updated port: 0f6fb919-4a83-4171-91fa-fa47d25e2247 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.254 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.254 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.255 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.351 186962 DEBUG nova.compute.manager [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-changed-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.352 186962 DEBUG nova.compute.manager [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Refreshing instance network info cache due to event network-changed-0f6fb919-4a83-4171-91fa-fa47d25e2247. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.352 186962 DEBUG oslo_concurrency.lockutils [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:22:23 np0005539505 nova_compute[186958]: 2025-11-29 07:22:23.473 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:22:23 np0005539505 podman[236721]: 2025-11-29 07:22:23.719159348 +0000 UTC m=+0.052135751 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:22:23 np0005539505 podman[236722]: 2025-11-29 07:22:23.75344444 +0000 UTC m=+0.079288290 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:22:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:24.288 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.517 186962 DEBUG nova.network.neutron [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Updating instance_info_cache with network_info: [{"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.537 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.538 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance network_info: |[{"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.538 186962 DEBUG oslo_concurrency.lockutils [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.538 186962 DEBUG nova.network.neutron [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Refreshing network info cache for port 0f6fb919-4a83-4171-91fa-fa47d25e2247 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.541 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start _get_guest_xml network_info=[{"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.546 186962 WARNING nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.550 186962 DEBUG nova.virt.libvirt.host [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.552 186962 DEBUG nova.virt.libvirt.host [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.559 186962 DEBUG nova.virt.libvirt.host [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.560 186962 DEBUG nova.virt.libvirt.host [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.562 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.562 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.562 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.563 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.563 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.563 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.563 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.563 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.564 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.564 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.564 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.564 186962 DEBUG nova.virt.hardware [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.569 186962 DEBUG nova.virt.libvirt.vif [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:18Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.570 186962 DEBUG nova.network.os_vif_util [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.570 186962 DEBUG nova.network.os_vif_util [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.571 186962 DEBUG nova.objects.instance [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.589 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <uuid>ef5226ee-eaff-4a79-bb36-b60389141ed0</uuid>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <name>instance-00000075</name>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1189634294</nova:name>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:22:24</nova:creationTime>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        <nova:port uuid="0f6fb919-4a83-4171-91fa-fa47d25e2247">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="serial">ef5226ee-eaff-4a79-bb36-b60389141ed0</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="uuid">ef5226ee-eaff-4a79-bb36-b60389141ed0</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:48:6a:e8"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <target dev="tap0f6fb919-4a"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/console.log" append="off"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:22:24 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:22:24 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:22:24 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:22:24 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.590 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Preparing to wait for external event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.591 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.592 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.592 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.593 186962 DEBUG nova.virt.libvirt.vif [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:18Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.593 186962 DEBUG nova.network.os_vif_util [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.594 186962 DEBUG nova.network.os_vif_util [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.594 186962 DEBUG os_vif [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.595 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.596 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.596 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.601 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f6fb919-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.602 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0f6fb919-4a, col_values=(('external_ids', {'iface-id': '0f6fb919-4a83-4171-91fa-fa47d25e2247', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:6a:e8', 'vm-uuid': 'ef5226ee-eaff-4a79-bb36-b60389141ed0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:24 np0005539505 NetworkManager[55134]: <info>  [1764400944.6058] manager: (tap0f6fb919-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.607 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.611 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.611 186962 INFO os_vif [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a')#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.667 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.668 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.668 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:48:6a:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:22:24 np0005539505 nova_compute[186958]: 2025-11-29 07:22:24.668 186962 INFO nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Using config drive#033[00m
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.102 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.437 186962 INFO nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating config drive at /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config#033[00m
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.443 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeozlg8jj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.570 186962 DEBUG oslo_concurrency.processutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeozlg8jj" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:25 np0005539505 kernel: tap0f6fb919-4a: entered promiscuous mode
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.6297] manager: (tap0f6fb919-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/259)
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:25Z|00526|binding|INFO|Claiming lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 for this chassis.
Nov 29 02:22:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:25Z|00527|binding|INFO|0f6fb919-4a83-4171-91fa-fa47d25e2247: Claiming fa:16:3e:48:6a:e8 10.100.0.12
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.645 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.657 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:6a:e8 10.100.0.12'], port_security=['fa:16:3e:48:6a:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ef5226ee-eaff-4a79-bb36-b60389141ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0f6fb919-4a83-4171-91fa-fa47d25e2247) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.659 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6fb919-4a83-4171-91fa-fa47d25e2247 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.661 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:22:25 np0005539505 systemd-machined[153285]: New machine qemu-60-instance-00000075.
Nov 29 02:22:25 np0005539505 systemd-udevd[236790]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.673 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86993ad3-fee3-40e1-82fa-8741d7b939b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.674 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.676 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.676 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66cd99ab-c1ad-449b-9262-c0d113b4c7cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.678 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[355ca9e2-d8e7-4508-b6e6-0b30e55eb637]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.6848] device (tap0f6fb919-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.6860] device (tap0f6fb919-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.691 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[0e380bea-21d5-40bb-952d-c61daea4d68f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:25Z|00528|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 ovn-installed in OVS
Nov 29 02:22:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:25Z|00529|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 up in Southbound
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.702 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 systemd[1]: Started Virtual Machine qemu-60-instance-00000075.
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.716 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ba11b084-573b-415b-a113-0b09609a993a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.747 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[921a3b98-ccc9-4bd7-b3c3-d3b5682f8eea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.7527] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/260)
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.751 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[776d8d8c-bdc6-4c62-b4e9-418667ad3bac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.781 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a0818972-b6f2-4921-bae9-3cfd19bfef7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.785 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2221384d-1168-4d3e-99d7-b970a6767ca6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.8080] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.813 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[84818726-f0f7-4bfd-b60c-9356a7662d6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.832 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[caf4ad8f-fc4c-418d-9263-fac5a73b343f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639339, 'reachable_time': 37003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236822, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.848 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[302191eb-71ee-43ec-899f-d61d1bf0d964]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 639339, 'tstamp': 639339}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236823, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.864 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0ecbfb20-bb1e-4ef0-b99a-eec514c16c6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639339, 'reachable_time': 37003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236824, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.893 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[19832978-098c-413b-bad1-d519409a28b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.955 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7fc6f6c-f006-4f4c-8343-82362af8757d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.956 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.957 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.957 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:25 np0005539505 NetworkManager[55134]: <info>  [1764400945.9606] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.960 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.963 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:25Z|00530|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.966 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.967 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86f0aa55-90ab-4369-9510-63d31ae4cbaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.968 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:22:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:25.971 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:22:25 np0005539505 nova_compute[186958]: 2025-11-29 07:22:25.977 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:26 np0005539505 podman[236856]: 2025-11-29 07:22:26.31817995 +0000 UTC m=+0.046396978 container create ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:22:26 np0005539505 systemd[1]: Started libpod-conmon-ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa.scope.
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.358 186962 DEBUG nova.compute.manager [req-365cccb1-6254-4e3e-970b-e2e32a5a1039 req-342c9660-14a0-44de-bba1-23005d4e2b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.359 186962 DEBUG oslo_concurrency.lockutils [req-365cccb1-6254-4e3e-970b-e2e32a5a1039 req-342c9660-14a0-44de-bba1-23005d4e2b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.359 186962 DEBUG oslo_concurrency.lockutils [req-365cccb1-6254-4e3e-970b-e2e32a5a1039 req-342c9660-14a0-44de-bba1-23005d4e2b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.359 186962 DEBUG oslo_concurrency.lockutils [req-365cccb1-6254-4e3e-970b-e2e32a5a1039 req-342c9660-14a0-44de-bba1-23005d4e2b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.360 186962 DEBUG nova.compute.manager [req-365cccb1-6254-4e3e-970b-e2e32a5a1039 req-342c9660-14a0-44de-bba1-23005d4e2b63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Processing event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:22:26 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:22:26 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1450cc44cf3a96b30c56efc93a1ba1f8f6af08dee9bf69b4b69b0174db4165/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:22:26 np0005539505 podman[236856]: 2025-11-29 07:22:26.289332651 +0000 UTC m=+0.017549699 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:22:26 np0005539505 podman[236856]: 2025-11-29 07:22:26.398644162 +0000 UTC m=+0.126861210 container init ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:22:26 np0005539505 podman[236856]: 2025-11-29 07:22:26.404724975 +0000 UTC m=+0.132942003 container start ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:22:26 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [NOTICE]   (236881) : New worker (236884) forked
Nov 29 02:22:26 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [NOTICE]   (236881) : Loading success.
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.455 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.456 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400946.4542844, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.456 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Started (Lifecycle Event)#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.462 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.466 186962 INFO nova.virt.libvirt.driver [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance spawned successfully.#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.466 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.470 186962 DEBUG nova.network.neutron [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Updated VIF entry in instance network info cache for port 0f6fb919-4a83-4171-91fa-fa47d25e2247. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.470 186962 DEBUG nova.network.neutron [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Updating instance_info_cache with network_info: [{"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.486 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.489 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.494 186962 DEBUG oslo_concurrency.lockutils [req-4431a185-0b31-4ea8-aff6-47bad2f0504c req-d52e79ff-98dc-414e-b76e-033249e11749 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ef5226ee-eaff-4a79-bb36-b60389141ed0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.498 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.498 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.499 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.499 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.500 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.500 186962 DEBUG nova.virt.libvirt.driver [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.533 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.533 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400946.4559238, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.534 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.596 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.601 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400946.4614415, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.601 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.621 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.624 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.647 186962 INFO nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Took 8.22 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.648 186962 DEBUG nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.649 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.738 186962 INFO nova.compute.manager [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Took 8.96 seconds to build instance.#033[00m
Nov 29 02:22:26 np0005539505 nova_compute[186958]: 2025-11-29 07:22:26.761 186962 DEBUG oslo_concurrency.lockutils [None req-42e818f2-b11a-4fb4-a335-2703faf0fc55 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:26.962 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:26.963 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:26.963 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.486 186962 DEBUG nova.compute.manager [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.488 186962 DEBUG oslo_concurrency.lockutils [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.488 186962 DEBUG oslo_concurrency.lockutils [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.488 186962 DEBUG oslo_concurrency.lockutils [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.488 186962 DEBUG nova.compute.manager [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:28 np0005539505 nova_compute[186958]: 2025-11-29 07:22:28.489 186962 WARNING nova.compute.manager [req-6fec4d3a-acd8-474a-976e-dfa5596ac84c req-76ed8595-7b46-41de-b639-f16b629d7cff 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received unexpected event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:22:28 np0005539505 podman[236893]: 2025-11-29 07:22:28.577962006 +0000 UTC m=+0.058026676 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd)
Nov 29 02:22:28 np0005539505 podman[236894]: 2025-11-29 07:22:28.580680743 +0000 UTC m=+0.056300507 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0)
Nov 29 02:22:29 np0005539505 nova_compute[186958]: 2025-11-29 07:22:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:29 np0005539505 nova_compute[186958]: 2025-11-29 07:22:29.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:30 np0005539505 nova_compute[186958]: 2025-11-29 07:22:30.150 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:30 np0005539505 nova_compute[186958]: 2025-11-29 07:22:30.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:30 np0005539505 nova_compute[186958]: 2025-11-29 07:22:30.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.202 186962 INFO nova.compute.manager [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Rebuilding instance#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.416 186962 DEBUG nova.compute.manager [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.494 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_requests' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.511 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.528 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.549 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.568 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:22:31 np0005539505 nova_compute[186958]: 2025-11-29 07:22:31.571 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:22:34 np0005539505 nova_compute[186958]: 2025-11-29 07:22:34.607 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:35 np0005539505 nova_compute[186958]: 2025-11-29 07:22:35.152 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:39 np0005539505 nova_compute[186958]: 2025-11-29 07:22:39.610 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:39Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:48:6a:e8 10.100.0.12
Nov 29 02:22:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:39Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:48:6a:e8 10.100.0.12
Nov 29 02:22:40 np0005539505 nova_compute[186958]: 2025-11-29 07:22:40.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:40 np0005539505 nova_compute[186958]: 2025-11-29 07:22:40.251 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:22:41 np0005539505 nova_compute[186958]: 2025-11-29 07:22:41.629 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:22:42 np0005539505 podman[236949]: 2025-11-29 07:22:42.730534299 +0000 UTC m=+0.058384058 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:22:42 np0005539505 podman[236948]: 2025-11-29 07:22:42.732311429 +0000 UTC m=+0.062152004 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:22:44 np0005539505 nova_compute[186958]: 2025-11-29 07:22:44.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:44 np0005539505 kernel: tap0f6fb919-4a (unregistering): left promiscuous mode
Nov 29 02:22:44 np0005539505 NetworkManager[55134]: <info>  [1764400964.7927] device (tap0f6fb919-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:22:44 np0005539505 nova_compute[186958]: 2025-11-29 07:22:44.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:44Z|00531|binding|INFO|Releasing lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 from this chassis (sb_readonly=0)
Nov 29 02:22:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:44Z|00532|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 down in Southbound
Nov 29 02:22:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:44Z|00533|binding|INFO|Removing iface tap0f6fb919-4a ovn-installed in OVS
Nov 29 02:22:44 np0005539505 nova_compute[186958]: 2025-11-29 07:22:44.803 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:44 np0005539505 nova_compute[186958]: 2025-11-29 07:22:44.819 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:44 np0005539505 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000075.scope: Deactivated successfully.
Nov 29 02:22:44 np0005539505 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000075.scope: Consumed 14.144s CPU time.
Nov 29 02:22:44 np0005539505 systemd-machined[153285]: Machine qemu-60-instance-00000075 terminated.
Nov 29 02:22:44 np0005539505 podman[236991]: 2025-11-29 07:22:44.863075955 +0000 UTC m=+0.049626709 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.157 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:45.480 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:6a:e8 10.100.0.12'], port_security=['fa:16:3e:48:6a:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ef5226ee-eaff-4a79-bb36-b60389141ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0f6fb919-4a83-4171-91fa-fa47d25e2247) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:22:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:45.482 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6fb919-4a83-4171-91fa-fa47d25e2247 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:22:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:45.484 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:22:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:45.486 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[695696b5-5766-4c5b-9311-e86649be855c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:45.487 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.602 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Triggering sync for uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.603 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.603 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.603 186962 INFO nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] During sync_power_state the instance has a pending task (rebuilding). Skip.#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.604 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.648 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance shutdown successfully after 14 seconds.#033[00m
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [NOTICE]   (236881) : haproxy version is 2.8.14-c23fe91
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [NOTICE]   (236881) : path to executable is /usr/sbin/haproxy
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [WARNING]  (236881) : Exiting Master process...
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [WARNING]  (236881) : Exiting Master process...
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [ALERT]    (236881) : Current worker (236884) exited with code 143 (Terminated)
Nov 29 02:22:45 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[236872]: [WARNING]  (236881) : All workers exited. Exiting... (0)
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.655 186962 INFO nova.virt.libvirt.driver [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance destroyed successfully.#033[00m
Nov 29 02:22:45 np0005539505 systemd[1]: libpod-ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa.scope: Deactivated successfully.
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.663 186962 INFO nova.virt.libvirt.driver [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance destroyed successfully.#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.664 186962 DEBUG nova.virt.libvirt.vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-projec
t-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:29Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:22:45 np0005539505 podman[237049]: 2025-11-29 07:22:45.664979528 +0000 UTC m=+0.094887303 container died ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.664 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.666 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.667 186962 DEBUG os_vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.669 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.669 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f6fb919-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.671 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.677 186962 INFO os_vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a')#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.677 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Deleting instance files /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0_del#033[00m
Nov 29 02:22:45 np0005539505 nova_compute[186958]: 2025-11-29 07:22:45.678 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Deletion of /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0_del complete#033[00m
Nov 29 02:22:45 np0005539505 systemd[1]: var-lib-containers-storage-overlay-3a1450cc44cf3a96b30c56efc93a1ba1f8f6af08dee9bf69b4b69b0174db4165-merged.mount: Deactivated successfully.
Nov 29 02:22:45 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa-userdata-shm.mount: Deactivated successfully.
Nov 29 02:22:45 np0005539505 podman[237049]: 2025-11-29 07:22:45.929766081 +0000 UTC m=+0.359673856 container cleanup ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:22:46 np0005539505 podman[237077]: 2025-11-29 07:22:46.104850129 +0000 UTC m=+0.152463107 container remove ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.110 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[98c4f998-2876-4694-b419-24a50e6e1691]: (4, ('Sat Nov 29 07:22:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa)\nac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa\nSat Nov 29 07:22:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa)\nac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.112 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ee4ae3-2607-41c1-a525-6238c0d4da7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 systemd[1]: libpod-conmon-ac25c41115ac6ae88ec544389165772dff9c3145765af73a9ee45cf80e1253fa.scope: Deactivated successfully.
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.113 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:46 np0005539505 nova_compute[186958]: 2025-11-29 07:22:46.115 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:46 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:22:46 np0005539505 nova_compute[186958]: 2025-11-29 07:22:46.117 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.122 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[02c2598f-24a4-4eec-8d76-283e06078c08]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 nova_compute[186958]: 2025-11-29 07:22:46.129 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.145 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[da46343c-b4cb-49c4-b782-926dae595eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.147 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66abbbad-1019-4825-8459-2c7963060c2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.163 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[19071798-7f4e-4c2e-929d-5596718380b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 639332, 'reachable_time': 17448, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237093, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:46 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.168 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:22:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:22:46.168 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[98228059-f756-4c6e-b31d-a6e2f671c912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.371 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.372 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating image(s)#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.372 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.373 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.373 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.390 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.463 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.464 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.464 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.477 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.535 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.536 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.605 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.606 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.607 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.666 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.667 186962 DEBUG nova.virt.disk.api [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.667 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.722 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.723 186962 DEBUG nova.virt.disk.api [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.724 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.724 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Ensure instance console log exists: /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.724 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.725 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.725 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.727 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start _get_guest_xml network_info=[{"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.734 186962 DEBUG nova.compute.manager [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.735 186962 DEBUG oslo_concurrency.lockutils [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.735 186962 DEBUG oslo_concurrency.lockutils [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.735 186962 DEBUG oslo_concurrency.lockutils [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.736 186962 DEBUG nova.compute.manager [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.736 186962 WARNING nova.compute.manager [req-0a14ec98-42c7-4d9f-8cd7-d31fb7422af8 req-ce756f70-6902-4531-97dc-0237944072e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received unexpected event network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.738 186962 WARNING nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.746 186962 DEBUG nova.virt.libvirt.host [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.746 186962 DEBUG nova.virt.libvirt.host [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.749 186962 DEBUG nova.virt.libvirt.host [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.749 186962 DEBUG nova.virt.libvirt.host [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.750 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.750 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.751 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.752 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.752 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.752 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.752 186962 DEBUG nova.virt.hardware [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:22:47 np0005539505 nova_compute[186958]: 2025-11-29 07:22:47.752 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.090 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.091 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.092 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.092 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.092 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.093 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:22:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:22:50 np0005539505 nova_compute[186958]: 2025-11-29 07:22:50.161 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:50 np0005539505 nova_compute[186958]: 2025-11-29 07:22:50.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.505 186962 DEBUG nova.compute.manager [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.506 186962 DEBUG oslo_concurrency.lockutils [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.506 186962 DEBUG oslo_concurrency.lockutils [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.506 186962 DEBUG oslo_concurrency.lockutils [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.506 186962 DEBUG nova.compute.manager [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.506 186962 WARNING nova.compute.manager [req-b45e4e3a-3b58-4093-813c-34bca02ac1c6 req-3e4a20e9-59e4-4b2f-b185-56ab521b0325 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received unexpected event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.511 186962 DEBUG nova.virt.libvirt.vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:46Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.512 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.513 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.514 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <uuid>ef5226ee-eaff-4a79-bb36-b60389141ed0</uuid>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <name>instance-00000075</name>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1189634294</nova:name>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:22:47</nova:creationTime>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        <nova:port uuid="0f6fb919-4a83-4171-91fa-fa47d25e2247">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="serial">ef5226ee-eaff-4a79-bb36-b60389141ed0</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="uuid">ef5226ee-eaff-4a79-bb36-b60389141ed0</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:48:6a:e8"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <target dev="tap0f6fb919-4a"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/console.log" append="off"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:22:51 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:22:51 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:22:51 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:22:51 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.515 186962 DEBUG nova.compute.manager [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Preparing to wait for external event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.515 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.516 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.516 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.517 186962 DEBUG nova.virt.libvirt.vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:22:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:22:46Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.517 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.517 186962 DEBUG nova.network.os_vif_util [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.518 186962 DEBUG os_vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.518 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.519 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.519 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.524 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f6fb919-4a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.525 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0f6fb919-4a, col_values=(('external_ids', {'iface-id': '0f6fb919-4a83-4171-91fa-fa47d25e2247', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:6a:e8', 'vm-uuid': 'ef5226ee-eaff-4a79-bb36-b60389141ed0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:22:51 np0005539505 NetworkManager[55134]: <info>  [1764400971.5277] manager: (tap0f6fb919-4a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.532 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.533 186962 INFO os_vif [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a')#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.926 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.927 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.927 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:48:6a:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:22:51 np0005539505 nova_compute[186958]: 2025-11-29 07:22:51.928 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Using config drive#033[00m
Nov 29 02:22:52 np0005539505 nova_compute[186958]: 2025-11-29 07:22:52.766 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'ec2_ids' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:54 np0005539505 podman[237113]: 2025-11-29 07:22:54.735795446 +0000 UTC m=+0.061259989 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:22:54 np0005539505 podman[237114]: 2025-11-29 07:22:54.798577817 +0000 UTC m=+0.122530447 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:22:55 np0005539505 nova_compute[186958]: 2025-11-29 07:22:55.162 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:56 np0005539505 nova_compute[186958]: 2025-11-29 07:22:56.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:56 np0005539505 nova_compute[186958]: 2025-11-29 07:22:56.569 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'keypairs' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.332 186962 INFO nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Creating config drive at /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config#033[00m
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.338 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsqjhqyxc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.468 186962 DEBUG oslo_concurrency.processutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsqjhqyxc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:22:58 np0005539505 kernel: tap0f6fb919-4a: entered promiscuous mode
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:58Z|00534|binding|INFO|Claiming lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 for this chassis.
Nov 29 02:22:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:58Z|00535|binding|INFO|0f6fb919-4a83-4171-91fa-fa47d25e2247: Claiming fa:16:3e:48:6a:e8 10.100.0.12
Nov 29 02:22:58 np0005539505 NetworkManager[55134]: <info>  [1764400978.5381] manager: (tap0f6fb919-4a): new Tun device (/org/freedesktop/NetworkManager/Devices/263)
Nov 29 02:22:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:22:58Z|00536|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 ovn-installed in OVS
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.552 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:58 np0005539505 nova_compute[186958]: 2025-11-29 07:22:58.555 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:22:58 np0005539505 systemd-udevd[237175]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:22:58 np0005539505 systemd-machined[153285]: New machine qemu-61-instance-00000075.
Nov 29 02:22:58 np0005539505 NetworkManager[55134]: <info>  [1764400978.5801] device (tap0f6fb919-4a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:22:58 np0005539505 NetworkManager[55134]: <info>  [1764400978.5810] device (tap0f6fb919-4a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:22:58 np0005539505 systemd[1]: Started Virtual Machine qemu-61-instance-00000075.
Nov 29 02:22:58 np0005539505 podman[237180]: 2025-11-29 07:22:58.679644904 +0000 UTC m=+0.065330345 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:22:58 np0005539505 podman[237181]: 2025-11-29 07:22:58.681300941 +0000 UTC m=+0.067237789 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:22:59 np0005539505 nova_compute[186958]: 2025-11-29 07:22:59.846 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for ef5226ee-eaff-4a79-bb36-b60389141ed0 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:22:59 np0005539505 nova_compute[186958]: 2025-11-29 07:22:59.848 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400979.845501, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:22:59 np0005539505 nova_compute[186958]: 2025-11-29 07:22:59.848 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.164 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:00Z|00537|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 up in Southbound
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.191 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:6a:e8 10.100.0.12'], port_security=['fa:16:3e:48:6a:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ef5226ee-eaff-4a79-bb36-b60389141ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '5', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0f6fb919-4a83-4171-91fa-fa47d25e2247) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.193 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6fb919-4a83-4171-91fa-fa47d25e2247 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.195 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.206 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[94db8f9e-cf49-4857-9757-973c2629be78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.208 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.210 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.210 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[31fd29b2-9c16-484c-b3e6-ccb08178b494]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.212 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48501723-5528-4804-afbe-b8570cd26bd7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.213 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.219 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400979.8471994, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.219 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.227 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[ceabbb25-aca8-4e07-849f-cd75b2cd5909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.251 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22efe8e2-2351-444d-8eb7-3663292627e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.282 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d84886b2-1a1f-49a4-acd5-9542cea137d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.289 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:00 np0005539505 NetworkManager[55134]: <info>  [1764400980.2895] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/264)
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.289 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[78a9dacd-6de1-4ed2-b363-752e10cf2f33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.293 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.315 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1815ea7d-04fe-433a-b9dc-6f7d6743a2d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.318 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a28ea7fc-2acc-4819-b58b-38df37cb3b67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.332 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:23:00 np0005539505 NetworkManager[55134]: <info>  [1764400980.3454] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.350 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[01c23cc2-a275-42ad-9be4-2388019640a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.365 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[225b88e4-6a8a-438d-9376-d7a83b86d26c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642792, 'reachable_time': 38459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237257, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.380 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03f9c85b-6203-4d15-b8b3-1a0f0455fa95]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 642792, 'tstamp': 642792}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237258, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.396 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef6e278-405c-4bf7-b0d5-f1f2eb56ec92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 170], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642792, 'reachable_time': 38459, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237259, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.424 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f89c59e-9e00-4110-aac0-42c2100a45e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.489 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[af6060af-de84-45e4-9e1b-58481fa33f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.491 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.491 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.491 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.493 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:23:00 np0005539505 NetworkManager[55134]: <info>  [1764400980.4944] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.496 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:00Z|00538|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.499 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.499 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[082fe393-3d6d-413e-a6af-cd6bd07de03f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.500 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:23:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:00.501 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:23:00 np0005539505 nova_compute[186958]: 2025-11-29 07:23:00.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:00 np0005539505 podman[237292]: 2025-11-29 07:23:00.806283113 +0000 UTC m=+0.019197026 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.069 186962 DEBUG nova.compute.manager [req-aa62a4a0-de1b-4a4f-84cd-0a371d669755 req-ff6833c4-2c74-4c12-bbbc-3c72604dd78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.071 186962 DEBUG oslo_concurrency.lockutils [req-aa62a4a0-de1b-4a4f-84cd-0a371d669755 req-ff6833c4-2c74-4c12-bbbc-3c72604dd78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.071 186962 DEBUG oslo_concurrency.lockutils [req-aa62a4a0-de1b-4a4f-84cd-0a371d669755 req-ff6833c4-2c74-4c12-bbbc-3c72604dd78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.071 186962 DEBUG oslo_concurrency.lockutils [req-aa62a4a0-de1b-4a4f-84cd-0a371d669755 req-ff6833c4-2c74-4c12-bbbc-3c72604dd78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.072 186962 DEBUG nova.compute.manager [req-aa62a4a0-de1b-4a4f-84cd-0a371d669755 req-ff6833c4-2c74-4c12-bbbc-3c72604dd78f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Processing event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.072 186962 DEBUG nova.compute.manager [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.075 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764400981.0756748, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.076 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.078 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.080 186962 INFO nova.virt.libvirt.driver [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance spawned successfully.#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.081 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.105 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.108 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.116 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.116 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.116 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.117 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.117 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.118 186962 DEBUG nova.virt.libvirt.driver [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.158 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.240 186962 DEBUG nova.compute.manager [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.529 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.691 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.691 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.692 186962 DEBUG nova.objects.instance [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:23:01 np0005539505 nova_compute[186958]: 2025-11-29 07:23:01.848 186962 DEBUG oslo_concurrency.lockutils [None req-77baea17-d7d9-4034-92bc-97159d16f08a 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:01 np0005539505 podman[237292]: 2025-11-29 07:23:01.901776806 +0000 UTC m=+1.114690689 container create 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:23:02 np0005539505 systemd[1]: Started libpod-conmon-09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3.scope.
Nov 29 02:23:02 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:23:02 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cbd7479dde81ceb0b5ae96fc5b9e8552d471a887ced04e3861a0b9f3e38efd4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:23:02 np0005539505 podman[237292]: 2025-11-29 07:23:02.061326243 +0000 UTC m=+1.274240126 container init 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:02 np0005539505 podman[237292]: 2025-11-29 07:23:02.06759178 +0000 UTC m=+1.280505663 container start 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:23:02 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [NOTICE]   (237311) : New worker (237313) forked
Nov 29 02:23:02 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [NOTICE]   (237311) : Loading success.
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.165 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.978 186962 DEBUG nova.compute.manager [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.978 186962 DEBUG oslo_concurrency.lockutils [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.979 186962 DEBUG oslo_concurrency.lockutils [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.979 186962 DEBUG oslo_concurrency.lockutils [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.979 186962 DEBUG nova.compute.manager [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:05 np0005539505 nova_compute[186958]: 2025-11-29 07:23:05.979 186962 WARNING nova.compute.manager [req-490fe826-1f23-469a-8568-0e1f78298ef4 req-21c1e804-dd1b-439a-818e-e8d7b667334e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received unexpected event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:06 np0005539505 nova_compute[186958]: 2025-11-29 07:23:06.532 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.236 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.237 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.237 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.237 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.238 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.283 186962 INFO nova.compute.manager [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Terminating instance#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.307 186962 DEBUG nova.compute.manager [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:23:07 np0005539505 kernel: tap0f6fb919-4a (unregistering): left promiscuous mode
Nov 29 02:23:07 np0005539505 NetworkManager[55134]: <info>  [1764400987.3316] device (tap0f6fb919-4a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:07Z|00539|binding|INFO|Releasing lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 from this chassis (sb_readonly=0)
Nov 29 02:23:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:07Z|00540|binding|INFO|Setting lport 0f6fb919-4a83-4171-91fa-fa47d25e2247 down in Southbound
Nov 29 02:23:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:07Z|00541|binding|INFO|Removing iface tap0f6fb919-4a ovn-installed in OVS
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.354 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:6a:e8 10.100.0.12'], port_security=['fa:16:3e:48:6a:e8 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ef5226ee-eaff-4a79-bb36-b60389141ed0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '6', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0f6fb919-4a83-4171-91fa-fa47d25e2247) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.354 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.356 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0f6fb919-4a83-4171-91fa-fa47d25e2247 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.359 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f524865-398e-427d-9abd-d61c09fad8c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.361 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:23:07 np0005539505 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000075.scope: Deactivated successfully.
Nov 29 02:23:07 np0005539505 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000075.scope: Consumed 7.613s CPU time.
Nov 29 02:23:07 np0005539505 systemd-machined[153285]: Machine qemu-61-instance-00000075 terminated.
Nov 29 02:23:07 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [NOTICE]   (237311) : haproxy version is 2.8.14-c23fe91
Nov 29 02:23:07 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [NOTICE]   (237311) : path to executable is /usr/sbin/haproxy
Nov 29 02:23:07 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [WARNING]  (237311) : Exiting Master process...
Nov 29 02:23:07 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [ALERT]    (237311) : Current worker (237313) exited with code 143 (Terminated)
Nov 29 02:23:07 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237307]: [WARNING]  (237311) : All workers exited. Exiting... (0)
Nov 29 02:23:07 np0005539505 systemd[1]: libpod-09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3.scope: Deactivated successfully.
Nov 29 02:23:07 np0005539505 podman[237347]: 2025-11-29 07:23:07.536638234 +0000 UTC m=+0.084605771 container died 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.570 186962 INFO nova.virt.libvirt.driver [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Instance destroyed successfully.#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.571 186962 DEBUG nova.objects.instance [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid ef5226ee-eaff-4a79-bb36-b60389141ed0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:07 np0005539505 systemd[1]: var-lib-containers-storage-overlay-5cbd7479dde81ceb0b5ae96fc5b9e8552d471a887ced04e3861a0b9f3e38efd4-merged.mount: Deactivated successfully.
Nov 29 02:23:07 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:23:07 np0005539505 podman[237347]: 2025-11-29 07:23:07.654824038 +0000 UTC m=+0.202791575 container cleanup 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:23:07 np0005539505 systemd[1]: libpod-conmon-09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3.scope: Deactivated successfully.
Nov 29 02:23:07 np0005539505 podman[237396]: 2025-11-29 07:23:07.71376833 +0000 UTC m=+0.040441498 container remove 09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.718 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e68d62b1-76d9-4a8b-9045-93beb1f1c3b1]: (4, ('Sat Nov 29 07:23:07 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3)\n09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3\nSat Nov 29 07:23:07 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3)\n09384dfc7589db063013365e25e97968ac94bd405621b298184ca2e605d8d1f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.720 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c703e09b-0139-43cd-a948-445067477ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.721 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:07 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.723 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 nova_compute[186958]: 2025-11-29 07:23:07.737 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.739 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b5c952-56ea-42ad-a297-bf00fcff16e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.756 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa7830c-0d6c-413a-b845-59c1ac2a9b6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.758 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b70e5a53-938a-40be-b0c4-5c6d1106fc6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.773 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[96fae58f-10d3-456d-a87f-be34529a76aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 642786, 'reachable_time': 31862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237415, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.775 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:23:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:07.775 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[74e78af1-b3bb-44ed-a5b8-c63fa62a5817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:07 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.967 186962 DEBUG nova.compute.manager [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.967 186962 DEBUG oslo_concurrency.lockutils [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.967 186962 DEBUG oslo_concurrency.lockutils [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.967 186962 DEBUG oslo_concurrency.lockutils [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.968 186962 DEBUG nova.compute.manager [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.968 186962 DEBUG nova.compute.manager [req-63ac81d8-c40d-4da7-a063-ef5c8f2c4e06 req-59bba87f-1447-4997-8cca-c07d734630bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-unplugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.987 186962 DEBUG nova.virt.libvirt.vif [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:22:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1189634294',display_name='tempest-ServerDiskConfigTestJSON-server-1189634294',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1189634294',id=117,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-o66e7759',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_h
w_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:23:01Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=ef5226ee-eaff-4a79-bb36-b60389141ed0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.987 186962 DEBUG nova.network.os_vif_util [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "address": "fa:16:3e:48:6a:e8", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0f6fb919-4a", "ovs_interfaceid": "0f6fb919-4a83-4171-91fa-fa47d25e2247", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.988 186962 DEBUG nova.network.os_vif_util [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.988 186962 DEBUG os_vif [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.990 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f6fb919-4a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.994 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.997 186962 INFO os_vif [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:6a:e8,bridge_name='br-int',has_traffic_filtering=True,id=0f6fb919-4a83-4171-91fa-fa47d25e2247,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0f6fb919-4a')#033[00m
Nov 29 02:23:08 np0005539505 nova_compute[186958]: 2025-11-29 07:23:08.998 186962 INFO nova.virt.libvirt.driver [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Deleting instance files /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0_del#033[00m
Nov 29 02:23:09 np0005539505 nova_compute[186958]: 2025-11-29 07:23:09.000 186962 INFO nova.virt.libvirt.driver [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Deletion of /var/lib/nova/instances/ef5226ee-eaff-4a79-bb36-b60389141ed0_del complete#033[00m
Nov 29 02:23:09 np0005539505 nova_compute[186958]: 2025-11-29 07:23:09.098 186962 INFO nova.compute.manager [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Took 1.79 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:23:09 np0005539505 nova_compute[186958]: 2025-11-29 07:23:09.099 186962 DEBUG oslo.service.loopingcall [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:23:09 np0005539505 nova_compute[186958]: 2025-11-29 07:23:09.099 186962 DEBUG nova.compute.manager [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:23:09 np0005539505 nova_compute[186958]: 2025-11-29 07:23:09.099 186962 DEBUG nova.network.neutron [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:23:10 np0005539505 nova_compute[186958]: 2025-11-29 07:23:10.168 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.526 186962 DEBUG nova.network.neutron [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.584 186962 INFO nova.compute.manager [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Took 3.48 seconds to deallocate network for instance.#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.609 186962 DEBUG nova.compute.manager [req-e741a7e2-e025-4397-a99e-640ecd31b3ad req-9e2c37c6-2b03-4aeb-8da9-ed0bebda5fd1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-deleted-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.697 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.697 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.763 186962 DEBUG nova.compute.provider_tree [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.773 186962 DEBUG nova.compute.manager [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.773 186962 DEBUG oslo_concurrency.lockutils [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.774 186962 DEBUG oslo_concurrency.lockutils [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.774 186962 DEBUG oslo_concurrency.lockutils [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.774 186962 DEBUG nova.compute.manager [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] No waiting events found dispatching network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.774 186962 WARNING nova.compute.manager [req-9ed0880f-0607-4773-a251-ae5b77cce81a req-c29570cb-89f2-4b19-a7b8-8643fdae4894 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Received unexpected event network-vif-plugged-0f6fb919-4a83-4171-91fa-fa47d25e2247 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.784 186962 DEBUG nova.scheduler.client.report [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.832 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.873 186962 INFO nova.scheduler.client.report [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocations for instance ef5226ee-eaff-4a79-bb36-b60389141ed0#033[00m
Nov 29 02:23:12 np0005539505 nova_compute[186958]: 2025-11-29 07:23:12.984 186962 DEBUG oslo_concurrency.lockutils [None req-8826f6ef-07f6-4179-8f8a-4280b36e09da 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "ef5226ee-eaff-4a79-bb36-b60389141ed0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:13 np0005539505 podman[237417]: 2025-11-29 07:23:13.731007152 +0000 UTC m=+0.050138010 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:23:13 np0005539505 nova_compute[186958]: 2025-11-29 07:23:13.730 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:13 np0005539505 nova_compute[186958]: 2025-11-29 07:23:13.730 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:13 np0005539505 nova_compute[186958]: 2025-11-29 07:23:13.731 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:13 np0005539505 podman[237416]: 2025-11-29 07:23:13.73131759 +0000 UTC m=+0.055618665 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git)
Nov 29 02:23:13 np0005539505 nova_compute[186958]: 2025-11-29 07:23:13.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.309 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.309 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.327 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.465 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.465 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.471 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.472 186962 INFO nova.compute.claims [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.609 186962 DEBUG nova.compute.provider_tree [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.637 186962 DEBUG nova.scheduler.client.report [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.661 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.662 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.766 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.767 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.794 186962 INFO nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.820 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.997 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.998 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:23:14 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.999 186962 INFO nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating image(s)#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:14.999 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.000 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.000 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.012 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.088 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.089 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.090 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.100 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.157 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.158 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.275 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk 1073741824" returned: 0 in 0.116s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.275 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.276 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.332 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.334 186962 DEBUG nova.virt.disk.api [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.334 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.390 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.391 186962 DEBUG nova.virt.disk.api [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.392 186962 DEBUG nova.objects.instance [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.429 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.430 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Ensure instance console log exists: /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.430 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.430 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.431 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:15 np0005539505 nova_compute[186958]: 2025-11-29 07:23:15.530 186962 DEBUG nova.policy [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:23:15 np0005539505 podman[237479]: 2025-11-29 07:23:15.716047382 +0000 UTC m=+0.049484211 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 29 02:23:16 np0005539505 nova_compute[186958]: 2025-11-29 07:23:16.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:16 np0005539505 nova_compute[186958]: 2025-11-29 07:23:16.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:23:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:17.319 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:17.320 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:23:17 np0005539505 nova_compute[186958]: 2025-11-29 07:23:17.320 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:18 np0005539505 nova_compute[186958]: 2025-11-29 07:23:18.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.301 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Successfully created port: 3e0864a9-c020-42b5-9976-8e419ce63072 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.405 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.405 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.405 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.405 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.544 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.545 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5702MB free_disk=73.07375717163086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.545 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.545 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.778 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance dbeeb9f5-635c-4a02-9525-1135c83a03a2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.778 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.778 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.858 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.879 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.909 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:23:19 np0005539505 nova_compute[186958]: 2025-11-29 07:23:19.910 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:20 np0005539505 nova_compute[186958]: 2025-11-29 07:23:20.170 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.376 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Successfully updated port: 3e0864a9-c020-42b5-9976-8e419ce63072 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.394 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.394 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.395 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.543 186962 DEBUG nova.compute.manager [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-changed-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.543 186962 DEBUG nova.compute.manager [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Refreshing instance network info cache due to event network-changed-3e0864a9-c020-42b5-9976-8e419ce63072. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.544 186962 DEBUG oslo_concurrency.lockutils [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.910 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.911 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.911 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.932 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:23:21 np0005539505 nova_compute[186958]: 2025-11-29 07:23:21.932 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:23:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:22.322 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:22 np0005539505 nova_compute[186958]: 2025-11-29 07:23:22.437 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:23:22 np0005539505 nova_compute[186958]: 2025-11-29 07:23:22.569 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764400987.568339, ef5226ee-eaff-4a79-bb36-b60389141ed0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:22 np0005539505 nova_compute[186958]: 2025-11-29 07:23:22.569 186962 INFO nova.compute.manager [-] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:23:22 np0005539505 nova_compute[186958]: 2025-11-29 07:23:22.594 186962 DEBUG nova.compute.manager [None req-097b9290-2abd-4025-9d24-806d3a04f279 - - - - - -] [instance: ef5226ee-eaff-4a79-bb36-b60389141ed0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:24 np0005539505 nova_compute[186958]: 2025-11-29 07:23:24.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:25 np0005539505 nova_compute[186958]: 2025-11-29 07:23:25.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:25 np0005539505 podman[237501]: 2025-11-29 07:23:25.71316655 +0000 UTC m=+0.049157902 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:23:25 np0005539505 podman[237502]: 2025-11-29 07:23:25.751051452 +0000 UTC m=+0.081528768 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.367 186962 DEBUG nova.network.neutron [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Updating instance_info_cache with network_info: [{"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.426 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.426 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance network_info: |[{"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.426 186962 DEBUG oslo_concurrency.lockutils [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.427 186962 DEBUG nova.network.neutron [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Refreshing network info cache for port 3e0864a9-c020-42b5-9976-8e419ce63072 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.430 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start _get_guest_xml network_info=[{"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.435 186962 WARNING nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.439 186962 DEBUG nova.virt.libvirt.host [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.440 186962 DEBUG nova.virt.libvirt.host [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.442 186962 DEBUG nova.virt.libvirt.host [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.443 186962 DEBUG nova.virt.libvirt.host [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.444 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.444 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.444 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.445 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.445 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.445 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.445 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.445 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.446 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.446 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.446 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.446 186962 DEBUG nova.virt.hardware [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.450 186962 DEBUG nova.virt.libvirt.vif [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:14Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.451 186962 DEBUG nova.network.os_vif_util [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.452 186962 DEBUG nova.network.os_vif_util [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.453 186962 DEBUG nova.objects.instance [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.478 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <uuid>dbeeb9f5-635c-4a02-9525-1135c83a03a2</uuid>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <name>instance-00000078</name>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-45658509</nova:name>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:23:26</nova:creationTime>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        <nova:port uuid="3e0864a9-c020-42b5-9976-8e419ce63072">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="serial">dbeeb9f5-635c-4a02-9525-1135c83a03a2</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="uuid">dbeeb9f5-635c-4a02-9525-1135c83a03a2</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a0:e4:cf"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <target dev="tap3e0864a9-c0"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/console.log" append="off"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:23:26 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:23:26 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:23:26 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:23:26 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.479 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Preparing to wait for external event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.480 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.480 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.480 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.481 186962 DEBUG nova.virt.libvirt.vif [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:14Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.481 186962 DEBUG nova.network.os_vif_util [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.482 186962 DEBUG nova.network.os_vif_util [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.483 186962 DEBUG os_vif [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.484 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.484 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.487 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e0864a9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.487 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e0864a9-c0, col_values=(('external_ids', {'iface-id': '3e0864a9-c020-42b5-9976-8e419ce63072', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:e4:cf', 'vm-uuid': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.489 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:26 np0005539505 NetworkManager[55134]: <info>  [1764401006.4896] manager: (tap3e0864a9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.494 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.495 186962 INFO os_vif [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0')#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.581 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.581 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.582 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:a0:e4:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:23:26 np0005539505 nova_compute[186958]: 2025-11-29 07:23:26.582 186962 INFO nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Using config drive#033[00m
Nov 29 02:23:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:26.964 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:26.964 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:26.964 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:29 np0005539505 nova_compute[186958]: 2025-11-29 07:23:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:23:29 np0005539505 podman[237552]: 2025-11-29 07:23:29.731127397 +0000 UTC m=+0.058091496 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 02:23:29 np0005539505 podman[237553]: 2025-11-29 07:23:29.731280831 +0000 UTC m=+0.056923222 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.404 186962 INFO nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating config drive at /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config#033[00m
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.409 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2a3sb6c2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.535 186962 DEBUG oslo_concurrency.processutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2a3sb6c2" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:30 np0005539505 kernel: tap3e0864a9-c0: entered promiscuous mode
Nov 29 02:23:30 np0005539505 NetworkManager[55134]: <info>  [1764401010.5936] manager: (tap3e0864a9-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Nov 29 02:23:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:30Z|00542|binding|INFO|Claiming lport 3e0864a9-c020-42b5-9976-8e419ce63072 for this chassis.
Nov 29 02:23:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:30Z|00543|binding|INFO|3e0864a9-c020-42b5-9976-8e419ce63072: Claiming fa:16:3e:a0:e4:cf 10.100.0.14
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:30Z|00544|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 ovn-installed in OVS
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:30 np0005539505 nova_compute[186958]: 2025-11-29 07:23:30.611 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:30 np0005539505 systemd-udevd[237609]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:30 np0005539505 NetworkManager[55134]: <info>  [1764401010.6371] device (tap3e0864a9-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:30 np0005539505 systemd-machined[153285]: New machine qemu-62-instance-00000078.
Nov 29 02:23:30 np0005539505 NetworkManager[55134]: <info>  [1764401010.6390] device (tap3e0864a9-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:30 np0005539505 systemd[1]: Started Virtual Machine qemu-62-instance-00000078.
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.275 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401011.2750466, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.276 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:31Z|00545|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 up in Southbound
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.862 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:e4:cf 10.100.0.14'], port_security=['fa:16:3e:a0:e4:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3e0864a9-c020-42b5-9976-8e419ce63072) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.863 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3e0864a9-c020-42b5-9976-8e419ce63072 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.864 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.875 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4a0649-5478-4b8b-98d6-4ae3bf1a71a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.876 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.878 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.878 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e71e20d-8586-4e9e-a996-7f4b42719826]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.879 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e573a558-706c-46b7-9214-322542b414c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.890 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c223ae77-3cb7-4c5f-9075-4cd1d4977cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.907 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.910 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401011.275297, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.911 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.913 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aef3a590-977d-4b43-9e58-541cf621179c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.928 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.931 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.942 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c25827e0-17b3-4586-855a-2452daf63f31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.947 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[912b6936-112c-4e06-9239-9c776666ddb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 NetworkManager[55134]: <info>  [1764401011.9483] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/268)
Nov 29 02:23:31 np0005539505 nova_compute[186958]: 2025-11-29 07:23:31.957 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.976 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[79e4acfd-e9a9-4ae8-bd48-b7d494cd5f98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:31.979 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e11c0e-7fff-4953-9aa1-2e4ca62d1ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 NetworkManager[55134]: <info>  [1764401012.0002] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.005 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc36aca-142e-4203-a10e-388e04a98b4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.019 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b56328c5-cd7d-4869-9469-2071e7cb00c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645958, 'reachable_time': 30581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237650, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.034 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4156ef-b199-464d-962b-e72041bf7462]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 645958, 'tstamp': 645958}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237651, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.048 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[00023c21-5f0f-4099-b778-ab81eb7f5ac6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 173], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645958, 'reachable_time': 30581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237652, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.078 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03eccc27-0b3c-4fd0-8cb6-edffe7b9616c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.136 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b2c191-7d4b-47d6-ad92-23711f432cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.137 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.138 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.138 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:32 np0005539505 NetworkManager[55134]: <info>  [1764401012.1405] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Nov 29 02:23:32 np0005539505 nova_compute[186958]: 2025-11-29 07:23:32.139 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:23:32 np0005539505 nova_compute[186958]: 2025-11-29 07:23:32.141 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.144 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:32Z|00546|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:23:32 np0005539505 nova_compute[186958]: 2025-11-29 07:23:32.145 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539505 nova_compute[186958]: 2025-11-29 07:23:32.146 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.148 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.149 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a62dc010-25f7-4fd7-a4d6-836ed86a4ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.149 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:23:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:32.150 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:23:32 np0005539505 nova_compute[186958]: 2025-11-29 07:23:32.158 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:32 np0005539505 podman[237685]: 2025-11-29 07:23:32.537786882 +0000 UTC m=+0.072168434 container create 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:32 np0005539505 systemd[1]: Started libpod-conmon-6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff.scope.
Nov 29 02:23:32 np0005539505 podman[237685]: 2025-11-29 07:23:32.503052039 +0000 UTC m=+0.037433621 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:23:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:23:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5ba8272764977493f64b450f0273ab4c224bf650b220c92762247317a0f7b7af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:23:32 np0005539505 podman[237685]: 2025-11-29 07:23:32.625767272 +0000 UTC m=+0.160148844 container init 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:23:32 np0005539505 podman[237685]: 2025-11-29 07:23:32.630605499 +0000 UTC m=+0.164987041 container start 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:23:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [NOTICE]   (237705) : New worker (237707) forked
Nov 29 02:23:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [NOTICE]   (237705) : Loading success.
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.292 186962 DEBUG nova.compute.manager [req-9a2e3418-6df7-44d5-ab36-3056e37507f7 req-5ad59879-6038-49b9-ac54-1cc82dafc5fa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.293 186962 DEBUG oslo_concurrency.lockutils [req-9a2e3418-6df7-44d5-ab36-3056e37507f7 req-5ad59879-6038-49b9-ac54-1cc82dafc5fa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.294 186962 DEBUG oslo_concurrency.lockutils [req-9a2e3418-6df7-44d5-ab36-3056e37507f7 req-5ad59879-6038-49b9-ac54-1cc82dafc5fa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.294 186962 DEBUG oslo_concurrency.lockutils [req-9a2e3418-6df7-44d5-ab36-3056e37507f7 req-5ad59879-6038-49b9-ac54-1cc82dafc5fa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.294 186962 DEBUG nova.compute.manager [req-9a2e3418-6df7-44d5-ab36-3056e37507f7 req-5ad59879-6038-49b9-ac54-1cc82dafc5fa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Processing event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.295 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.299 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401013.2997026, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.300 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.301 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.304 186962 INFO nova.virt.libvirt.driver [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance spawned successfully.#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.305 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.326 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.333 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.336 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.336 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.337 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.337 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.337 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.338 186962 DEBUG nova.virt.libvirt.driver [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.397 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.460 186962 INFO nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Took 18.46 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.461 186962 DEBUG nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.566 186962 INFO nova.compute.manager [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Took 19.15 seconds to build instance.#033[00m
Nov 29 02:23:33 np0005539505 nova_compute[186958]: 2025-11-29 07:23:33.588 186962 DEBUG oslo_concurrency.lockutils [None req-d37f358e-3010-482c-8c39-aa71bcdc8b67 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.279s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:34 np0005539505 nova_compute[186958]: 2025-11-29 07:23:34.377 186962 DEBUG nova.network.neutron [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Updated VIF entry in instance network info cache for port 3e0864a9-c020-42b5-9976-8e419ce63072. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:23:34 np0005539505 nova_compute[186958]: 2025-11-29 07:23:34.378 186962 DEBUG nova.network.neutron [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Updating instance_info_cache with network_info: [{"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:23:34 np0005539505 nova_compute[186958]: 2025-11-29 07:23:34.397 186962 DEBUG oslo_concurrency.lockutils [req-e8a4a48c-29b5-438a-9fa1-7206ba262bd8 req-ddd7c2be-906c-4a05-a9ca-daaaf26cabe3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-dbeeb9f5-635c-4a02-9525-1135c83a03a2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.178 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.513 186962 DEBUG nova.compute.manager [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.514 186962 DEBUG oslo_concurrency.lockutils [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.514 186962 DEBUG oslo_concurrency.lockutils [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.515 186962 DEBUG oslo_concurrency.lockutils [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.515 186962 DEBUG nova.compute.manager [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No waiting events found dispatching network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:23:35 np0005539505 nova_compute[186958]: 2025-11-29 07:23:35.515 186962 WARNING nova.compute.manager [req-35e44983-8198-46d9-894a-55561235c4af req-df17a435-5b71-48f2-bae9-d96d67ab88bd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received unexpected event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:23:36 np0005539505 nova_compute[186958]: 2025-11-29 07:23:36.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:39 np0005539505 nova_compute[186958]: 2025-11-29 07:23:39.641 186962 INFO nova.compute.manager [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Rebuilding instance#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.005 186962 DEBUG nova.compute.manager [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.147 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_requests' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.166 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.182 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.183 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.199 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.216 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:23:40 np0005539505 nova_compute[186958]: 2025-11-29 07:23:40.219 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:23:41 np0005539505 nova_compute[186958]: 2025-11-29 07:23:41.500 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:44 np0005539505 podman[237729]: 2025-11-29 07:23:44.757337521 +0000 UTC m=+0.079003607 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Nov 29 02:23:44 np0005539505 podman[237730]: 2025-11-29 07:23:44.781542796 +0000 UTC m=+0.096093531 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:23:45 np0005539505 nova_compute[186958]: 2025-11-29 07:23:45.185 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:46 np0005539505 nova_compute[186958]: 2025-11-29 07:23:46.503 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:46Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:e4:cf 10.100.0.14
Nov 29 02:23:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:46Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:e4:cf 10.100.0.14
Nov 29 02:23:46 np0005539505 podman[237773]: 2025-11-29 07:23:46.743086842 +0000 UTC m=+0.075719264 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 29 02:23:50 np0005539505 nova_compute[186958]: 2025-11-29 07:23:50.187 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:50 np0005539505 nova_compute[186958]: 2025-11-29 07:23:50.266 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Nov 29 02:23:51 np0005539505 nova_compute[186958]: 2025-11-29 07:23:51.506 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 kernel: tap3e0864a9-c0 (unregistering): left promiscuous mode
Nov 29 02:23:52 np0005539505 NetworkManager[55134]: <info>  [1764401032.5157] device (tap3e0864a9-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:23:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:52Z|00547|binding|INFO|Releasing lport 3e0864a9-c020-42b5-9976-8e419ce63072 from this chassis (sb_readonly=0)
Nov 29 02:23:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:52Z|00548|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 down in Southbound
Nov 29 02:23:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:52Z|00549|binding|INFO|Removing iface tap3e0864a9-c0 ovn-installed in OVS
Nov 29 02:23:52 np0005539505 nova_compute[186958]: 2025-11-29 07:23:52.525 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 nova_compute[186958]: 2025-11-29 07:23:52.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000078.scope: Deactivated successfully.
Nov 29 02:23:52 np0005539505 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000078.scope: Consumed 13.645s CPU time.
Nov 29 02:23:52 np0005539505 systemd-machined[153285]: Machine qemu-62-instance-00000078 terminated.
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.650 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:e4:cf 10.100.0.14'], port_security=['fa:16:3e:a0:e4:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3e0864a9-c020-42b5-9976-8e419ce63072) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.652 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3e0864a9-c020-42b5-9976-8e419ce63072 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.654 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.655 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4a01aa-c2fc-4318-8900-6ed1f4f92ce4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.655 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:23:52 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [NOTICE]   (237705) : haproxy version is 2.8.14-c23fe91
Nov 29 02:23:52 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [NOTICE]   (237705) : path to executable is /usr/sbin/haproxy
Nov 29 02:23:52 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [WARNING]  (237705) : Exiting Master process...
Nov 29 02:23:52 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [ALERT]    (237705) : Current worker (237707) exited with code 143 (Terminated)
Nov 29 02:23:52 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[237701]: [WARNING]  (237705) : All workers exited. Exiting... (0)
Nov 29 02:23:52 np0005539505 systemd[1]: libpod-6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff.scope: Deactivated successfully.
Nov 29 02:23:52 np0005539505 podman[237814]: 2025-11-29 07:23:52.805917029 +0000 UTC m=+0.058912738 container died 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:23:52 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff-userdata-shm.mount: Deactivated successfully.
Nov 29 02:23:52 np0005539505 systemd[1]: var-lib-containers-storage-overlay-5ba8272764977493f64b450f0273ab4c224bf650b220c92762247317a0f7b7af-merged.mount: Deactivated successfully.
Nov 29 02:23:52 np0005539505 podman[237814]: 2025-11-29 07:23:52.846348064 +0000 UTC m=+0.099343773 container cleanup 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:23:52 np0005539505 systemd[1]: libpod-conmon-6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff.scope: Deactivated successfully.
Nov 29 02:23:52 np0005539505 podman[237859]: 2025-11-29 07:23:52.917397175 +0000 UTC m=+0.046012914 container remove 6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.924 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a60e85b4-d546-4437-a8cc-04a268ac7fe7]: (4, ('Sat Nov 29 07:23:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff)\n6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff\nSat Nov 29 07:23:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff)\n6d5fe5ff53be3d4c3faaa36bc47ed5fa8d9a63441d42aa7504295278354b2dff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.926 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[81599856-8e78-4a84-93e9-3d5f4a22d80f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.927 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:52 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:23:52 np0005539505 nova_compute[186958]: 2025-11-29 07:23:52.930 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 nova_compute[186958]: 2025-11-29 07:23:52.944 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 nova_compute[186958]: 2025-11-29 07:23:52.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.948 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9d549427-d7c8-4f87-b4f2-cfc5aece3cfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.964 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd94404-6864-43a8-9362-ae04501adc5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.966 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[864bc536-c268-45cb-84ca-55c605d2f879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.982 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a32dce-7e9c-4027-9c7d-bc516b2389c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 645952, 'reachable_time': 34688, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237878, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.985 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:23:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:52.985 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[791630c3-aa31-4a13-845c-c8fd5c9fcc91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:52 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.280 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.284 186962 INFO nova.virt.libvirt.driver [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance destroyed successfully.#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.288 186962 INFO nova.virt.libvirt.driver [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance destroyed successfully.#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.289 186962 DEBUG nova.virt.libvirt.vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:38Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.289 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.290 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.290 186962 DEBUG os_vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.292 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e0864a9-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.294 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.295 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.297 186962 INFO os_vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0')#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.298 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Deleting instance files /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2_del#033[00m
Nov 29 02:23:53 np0005539505 nova_compute[186958]: 2025-11-29 07:23:53.299 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Deletion of /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2_del complete#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.528 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.528 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating image(s)#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.529 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.529 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.530 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.542 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.597 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.598 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "923f30c548f83d073f1130ce28fd6a6debb4b123" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.599 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.612 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.674 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.676 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.764 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123,backing_fmt=raw /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk 1073741824" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.765 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "923f30c548f83d073f1130ce28fd6a6debb4b123" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.766 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.829 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.830 186962 DEBUG nova.virt.disk.api [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.830 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.890 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.891 186962 DEBUG nova.virt.disk.api [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.891 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.892 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Ensure instance console log exists: /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.892 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.892 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.893 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.895 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start _get_guest_xml network_info=[{"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.901 186962 WARNING nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.908 186962 DEBUG nova.virt.libvirt.host [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.909 186962 DEBUG nova.virt.libvirt.host [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.912 186962 DEBUG nova.virt.libvirt.host [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.912 186962 DEBUG nova.virt.libvirt.host [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.914 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.914 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:09Z,direct_url=<?>,disk_format='qcow2',id=3372b7b2-657b-4c4d-9d9d-7c5b771a630a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.914 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.914 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.915 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.915 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.915 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.915 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.915 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.916 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.916 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.916 186962 DEBUG nova.virt.hardware [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:23:54 np0005539505 nova_compute[186958]: 2025-11-29 07:23:54.916 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'vcpu_model' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.056 186962 DEBUG nova.virt.libvirt.vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-S
erverDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:54Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.057 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.057 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.059 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <uuid>dbeeb9f5-635c-4a02-9525-1135c83a03a2</uuid>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <name>instance-00000078</name>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-45658509</nova:name>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:23:54</nova:creationTime>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="3372b7b2-657b-4c4d-9d9d-7c5b771a630a"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        <nova:port uuid="3e0864a9-c020-42b5-9976-8e419ce63072">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="serial">dbeeb9f5-635c-4a02-9525-1135c83a03a2</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="uuid">dbeeb9f5-635c-4a02-9525-1135c83a03a2</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a0:e4:cf"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <target dev="tap3e0864a9-c0"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/console.log" append="off"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:23:55 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:23:55 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:23:55 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:23:55 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.059 186962 DEBUG nova.compute.manager [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Preparing to wait for external event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.060 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.060 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.060 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.060 186962 DEBUG nova.virt.libvirt.vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:23:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-S
erverDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:23:54Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.061 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.061 186962 DEBUG nova.network.os_vif_util [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.061 186962 DEBUG os_vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.062 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.063 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.065 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e0864a9-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.066 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e0864a9-c0, col_values=(('external_ids', {'iface-id': '3e0864a9-c020-42b5-9976-8e419ce63072', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:e4:cf', 'vm-uuid': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:55 np0005539505 NetworkManager[55134]: <info>  [1764401035.0693] manager: (tap3e0864a9-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/270)
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.070 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.072 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.074 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.075 186962 INFO os_vif [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0')#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.188 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.381 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.381 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.381 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:a0:e4:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.382 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Using config drive#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.475 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'ec2_ids' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:55 np0005539505 nova_compute[186958]: 2025-11-29 07:23:55.923 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'keypairs' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.654 186962 INFO nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Creating config drive at /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config#033[00m
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.659 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp91_cigx9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:23:56 np0005539505 podman[237897]: 2025-11-29 07:23:56.753781972 +0000 UTC m=+0.084354769 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:23:56 np0005539505 podman[237898]: 2025-11-29 07:23:56.815069796 +0000 UTC m=+0.145198510 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.819 186962 DEBUG oslo_concurrency.processutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp91_cigx9" returned: 0 in 0.159s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:23:56 np0005539505 kernel: tap3e0864a9-c0: entered promiscuous mode
Nov 29 02:23:56 np0005539505 NetworkManager[55134]: <info>  [1764401036.8786] manager: (tap3e0864a9-c0): new Tun device (/org/freedesktop/NetworkManager/Devices/271)
Nov 29 02:23:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:56Z|00550|binding|INFO|Claiming lport 3e0864a9-c020-42b5-9976-8e419ce63072 for this chassis.
Nov 29 02:23:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:56Z|00551|binding|INFO|3e0864a9-c020-42b5-9976-8e419ce63072: Claiming fa:16:3e:a0:e4:cf 10.100.0.14
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:56Z|00552|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 ovn-installed in OVS
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:56 np0005539505 nova_compute[186958]: 2025-11-29 07:23:56.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:56 np0005539505 systemd-udevd[237961]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:56 np0005539505 systemd-machined[153285]: New machine qemu-63-instance-00000078.
Nov 29 02:23:56 np0005539505 NetworkManager[55134]: <info>  [1764401036.9276] device (tap3e0864a9-c0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:23:56 np0005539505 NetworkManager[55134]: <info>  [1764401036.9283] device (tap3e0864a9-c0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:23:56 np0005539505 systemd[1]: Started Virtual Machine qemu-63-instance-00000078.
Nov 29 02:23:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:57Z|00553|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 up in Southbound
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.193 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:e4:cf 10.100.0.14'], port_security=['fa:16:3e:a0:e4:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3e0864a9-c020-42b5-9976-8e419ce63072) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.195 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3e0864a9-c020-42b5-9976-8e419ce63072 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.197 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.208 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6097283-6e66-4138-a45d-6790e6e0822f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.209 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.212 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.212 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[866a65b4-98d3-4eae-af01-fda099c968e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.213 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[53d4a75c-8d13-4fc5-b1c6-153f7649d7f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.225 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[40644cf5-5c17-4df4-b6cd-486c14689741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.248 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c7103291-fe20-43b7-8adf-1d847d15ca41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.289 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1476e0-84d4-4081-8090-adff7c254dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 NetworkManager[55134]: <info>  [1764401037.2960] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/272)
Nov 29 02:23:57 np0005539505 systemd-udevd[237964]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.296 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ac49db8f-98c6-4be2-a951-d31bd474b5cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.330 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[890405d3-8303-4b5b-8a6a-bcdd9650238f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.335 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0997ff71-924c-4f28-a397-6a80d63dc31c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 NetworkManager[55134]: <info>  [1764401037.3581] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.365 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[30410f45-8a5f-4627-a869-7949d2a792cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.382 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c666a863-45a6-432e-ac3d-8af670f16ff7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648494, 'reachable_time': 17890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238002, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.392 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for dbeeb9f5-635c-4a02-9525-1135c83a03a2 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.393 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401037.391753, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.393 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Started (Lifecycle Event)#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.398 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[89ac65c6-5aac-4dcb-845c-01298e551066]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 648494, 'tstamp': 648494}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238003, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.414 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0a5ad0-3380-4798-b200-a9c635132ac7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 176], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648494, 'reachable_time': 17890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238004, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.450 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a688a661-4bdf-4761-b228-50ea3a78a8e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.500 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[14d7ea9e-13ff-463b-ac0c-5561bbc192e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.502 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.502 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.502 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.504 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:57 np0005539505 NetworkManager[55134]: <info>  [1764401037.5052] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Nov 29 02:23:57 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.509 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:23:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:23:57Z|00554|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.536 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.535 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[01017177-f520-4afe-8d08-178901e5fc29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.538 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:23:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:23:57.538 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.541 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401037.391972, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.541 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.545 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.579 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.582 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:23:57 np0005539505 nova_compute[186958]: 2025-11-29 07:23:57.719 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:23:57 np0005539505 podman[238036]: 2025-11-29 07:23:57.893247231 +0000 UTC m=+0.047196477 container create 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:23:57 np0005539505 systemd[1]: Started libpod-conmon-6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb.scope.
Nov 29 02:23:57 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:23:57 np0005539505 podman[238036]: 2025-11-29 07:23:57.867325228 +0000 UTC m=+0.021274504 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:23:57 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c65a883a36497872adf83eef0e36bf2f86fa3dfff556a89b6e6909c23238f4c3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:23:57 np0005539505 podman[238036]: 2025-11-29 07:23:57.981404846 +0000 UTC m=+0.135354112 container init 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:23:57 np0005539505 podman[238036]: 2025-11-29 07:23:57.98682172 +0000 UTC m=+0.140770966 container start 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 02:23:58 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [NOTICE]   (238055) : New worker (238057) forked
Nov 29 02:23:58 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [NOTICE]   (238055) : Loading success.
Nov 29 02:24:00 np0005539505 nova_compute[186958]: 2025-11-29 07:24:00.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:00 np0005539505 nova_compute[186958]: 2025-11-29 07:24:00.190 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:00 np0005539505 podman[238066]: 2025-11-29 07:24:00.716071362 +0000 UTC m=+0.053321080 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:24:00 np0005539505 podman[238067]: 2025-11-29 07:24:00.716052392 +0000 UTC m=+0.051476378 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.036 186962 DEBUG nova.compute.manager [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.036 186962 DEBUG oslo_concurrency.lockutils [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.036 186962 DEBUG oslo_concurrency.lockutils [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.037 186962 DEBUG oslo_concurrency.lockutils [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.037 186962 DEBUG nova.compute.manager [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No event matching network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 in dict_keys([('network-vif-plugged', '3e0864a9-c020-42b5-9976-8e419ce63072')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 29 02:24:01 np0005539505 nova_compute[186958]: 2025-11-29 07:24:01.037 186962 WARNING nova.compute.manager [req-9b1354b6-73cc-4201-8735-7b062488a677 req-6f58159e-7da1-4ac4-b782-32597b333fab 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received unexpected event network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.153 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.153 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.153 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.154 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.154 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Processing event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.154 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.154 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.154 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No waiting events found dispatching network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 WARNING nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received unexpected event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.155 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.156 186962 DEBUG oslo_concurrency.lockutils [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.156 186962 DEBUG nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No waiting events found dispatching network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.156 186962 WARNING nova.compute.manager [req-cda71621-896f-48c2-ae0d-8b299c501d88 req-659c340d-2984-4779-b66f-10dadaacc141 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received unexpected event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.156 186962 DEBUG nova.compute.manager [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.160 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401043.1599834, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.160 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.161 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.165 186962 INFO nova.virt.libvirt.driver [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance spawned successfully.#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.166 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.205 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.211 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.214 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.214 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.215 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.215 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.216 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.216 186962 DEBUG nova.virt.libvirt.driver [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.242 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.295 186962 DEBUG nova.compute.manager [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.387 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.387 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.387 186962 DEBUG nova.objects.instance [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Nov 29 02:24:03 np0005539505 nova_compute[186958]: 2025-11-29 07:24:03.486 186962 DEBUG oslo_concurrency.lockutils [None req-252bf5f5-1cac-4593-9d3c-3087f1e9d702 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:05 np0005539505 nova_compute[186958]: 2025-11-29 07:24:05.121 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:05 np0005539505 nova_compute[186958]: 2025-11-29 07:24:05.192 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.106 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.106 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.107 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.107 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.107 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.122 186962 INFO nova.compute.manager [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Terminating instance#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.139 186962 DEBUG nova.compute.manager [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:24:06 np0005539505 kernel: tap3e0864a9-c0 (unregistering): left promiscuous mode
Nov 29 02:24:06 np0005539505 NetworkManager[55134]: <info>  [1764401046.1567] device (tap3e0864a9-c0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:24:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:06Z|00555|binding|INFO|Releasing lport 3e0864a9-c020-42b5-9976-8e419ce63072 from this chassis (sb_readonly=0)
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.169 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:06Z|00556|binding|INFO|Setting lport 3e0864a9-c020-42b5-9976-8e419ce63072 down in Southbound
Nov 29 02:24:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:06Z|00557|binding|INFO|Removing iface tap3e0864a9-c0 ovn-installed in OVS
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.170 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.175 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:e4:cf 10.100.0.14'], port_security=['fa:16:3e:a0:e4:cf 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dbeeb9f5-635c-4a02-9525-1135c83a03a2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '6', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3e0864a9-c020-42b5-9976-8e419ce63072) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.177 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3e0864a9-c020-42b5-9976-8e419ce63072 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.179 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.180 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[65a05a3e-48df-450e-aa94-bd378205a0b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.181 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.186 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000078.scope: Deactivated successfully.
Nov 29 02:24:06 np0005539505 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000078.scope: Consumed 3.407s CPU time.
Nov 29 02:24:06 np0005539505 systemd-machined[153285]: Machine qemu-63-instance-00000078 terminated.
Nov 29 02:24:06 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [NOTICE]   (238055) : haproxy version is 2.8.14-c23fe91
Nov 29 02:24:06 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [NOTICE]   (238055) : path to executable is /usr/sbin/haproxy
Nov 29 02:24:06 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [WARNING]  (238055) : Exiting Master process...
Nov 29 02:24:06 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [ALERT]    (238055) : Current worker (238057) exited with code 143 (Terminated)
Nov 29 02:24:06 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238051]: [WARNING]  (238055) : All workers exited. Exiting... (0)
Nov 29 02:24:06 np0005539505 systemd[1]: libpod-6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb.scope: Deactivated successfully.
Nov 29 02:24:06 np0005539505 podman[238128]: 2025-11-29 07:24:06.311565337 +0000 UTC m=+0.043160033 container died 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:24:06 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb-userdata-shm.mount: Deactivated successfully.
Nov 29 02:24:06 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c65a883a36497872adf83eef0e36bf2f86fa3dfff556a89b6e6909c23238f4c3-merged.mount: Deactivated successfully.
Nov 29 02:24:06 np0005539505 podman[238128]: 2025-11-29 07:24:06.34631269 +0000 UTC m=+0.077907386 container cleanup 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:24:06 np0005539505 systemd[1]: libpod-conmon-6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb.scope: Deactivated successfully.
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.393 186962 INFO nova.virt.libvirt.driver [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Instance destroyed successfully.#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.394 186962 DEBUG nova.objects.instance [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'resources' on Instance uuid dbeeb9f5-635c-4a02-9525-1135c83a03a2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.404 186962 DEBUG nova.virt.libvirt.vif [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-11-29T07:23:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-45658509',display_name='tempest-ServerDiskConfigTestJSON-server-45658509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-45658509',id=120,image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-12vq4c90',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='3372b7b2-657b-4c4d-9d9d-7c5b771a630a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:03Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=dbeeb9f5-635c-4a02-9525-1135c83a03a2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.404 186962 DEBUG nova.network.os_vif_util [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3e0864a9-c020-42b5-9976-8e419ce63072", "address": "fa:16:3e:a0:e4:cf", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e0864a9-c0", "ovs_interfaceid": "3e0864a9-c020-42b5-9976-8e419ce63072", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.405 186962 DEBUG nova.network.os_vif_util [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.406 186962 DEBUG os_vif [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.408 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.408 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e0864a9-c0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.410 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 podman[238163]: 2025-11-29 07:24:06.411798894 +0000 UTC m=+0.042426682 container remove 6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.413 186962 INFO os_vif [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:e4:cf,bridge_name='br-int',has_traffic_filtering=True,id=3e0864a9-c020-42b5-9976-8e419ce63072,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e0864a9-c0')#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.413 186962 INFO nova.virt.libvirt.driver [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Deleting instance files /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2_del#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.414 186962 INFO nova.virt.libvirt.driver [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Deletion of /var/lib/nova/instances/dbeeb9f5-635c-4a02-9525-1135c83a03a2_del complete#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.416 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c52e0404-476c-4c19-bef2-eccf746c2da8]: (4, ('Sat Nov 29 07:24:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb)\n6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb\nSat Nov 29 07:24:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb)\n6075620673ea26ae2a27aa6e529d70147a3a9788fd116b9dc8278db7373c7ecb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.418 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[28417920-8ad6-432d-8d23-064aab2cdd31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.419 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.420 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.432 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.433 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4b16b8-a105-464a-b3e3-303da5df1a0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.453 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb55d86-e723-4a90-94c8-02464079db5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.454 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82de71b4-84b2-4c2d-be12-d6a04b2a330a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.467 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[564f4703-d95d-4961-990b-048fa8666287]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 648486, 'reachable_time': 36133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238191, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.471 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:24:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:06.471 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8fdfb5c6-6e9f-4c3f-9e4c-6dec232d7135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.496 186962 INFO nova.compute.manager [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.497 186962 DEBUG oslo.service.loopingcall [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.497 186962 DEBUG nova.compute.manager [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:24:06 np0005539505 nova_compute[186958]: 2025-11-29 07:24:06.497 186962 DEBUG nova.network.neutron [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.871 186962 DEBUG nova.compute.manager [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.871 186962 DEBUG oslo_concurrency.lockutils [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.871 186962 DEBUG oslo_concurrency.lockutils [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.872 186962 DEBUG oslo_concurrency.lockutils [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.872 186962 DEBUG nova.compute.manager [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No waiting events found dispatching network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:08 np0005539505 nova_compute[186958]: 2025-11-29 07:24:08.872 186962 DEBUG nova.compute.manager [req-c4b81e51-16c7-42cf-8e20-29cda5a7628f req-5c4f940f-f2e5-498e-97b3-6cdf3eaf1e88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-unplugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.741 186962 DEBUG nova.network.neutron [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.762 186962 INFO nova.compute.manager [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Took 3.26 seconds to deallocate network for instance.#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.847 186962 DEBUG nova.compute.manager [req-4b282e9b-49ea-4be2-80a4-f3b797009058 req-bbd9b80b-04ec-4f84-a468-90ae6e20946d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-deleted-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.850 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.850 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.911 186962 DEBUG nova.compute.provider_tree [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.923 186962 DEBUG nova.scheduler.client.report [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:09 np0005539505 nova_compute[186958]: 2025-11-29 07:24:09.940 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.015 186962 INFO nova.scheduler.client.report [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocations for instance dbeeb9f5-635c-4a02-9525-1135c83a03a2#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.091 186962 DEBUG oslo_concurrency.lockutils [None req-6711b999-d2c4-45ca-8598-73e099360a11 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.984s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.193 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.470 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.470 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.503 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.611 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.612 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.618 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.618 186962 INFO nova.compute.claims [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.751 186962 DEBUG nova.compute.provider_tree [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.764 186962 DEBUG nova.scheduler.client.report [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.782 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.783 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.846 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.847 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.873 186962 INFO nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:24:10 np0005539505 nova_compute[186958]: 2025-11-29 07:24:10.896 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.042 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.044 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.044 186962 INFO nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Creating image(s)#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.044 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.045 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.045 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.059 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.079 186962 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.080 186962 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.080 186962 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.080 186962 DEBUG oslo_concurrency.lockutils [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "dbeeb9f5-635c-4a02-9525-1135c83a03a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.081 186962 DEBUG nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] No waiting events found dispatching network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.081 186962 WARNING nova.compute.manager [req-547700c3-f9c7-4ffb-a2d3-92397b5128d9 req-484b641e-9544-4b53-b609-96f44ea5579a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Received unexpected event network-vif-plugged-3e0864a9-c020-42b5-9976-8e419ce63072 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.120 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.121 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.122 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.132 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.149 186962 DEBUG nova.policy [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.187 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.188 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.461 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.819 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk 1073741824" returned: 0 in 0.631s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.819 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.820 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.877 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.878 186962 DEBUG nova.virt.disk.api [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.879 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.938 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.939 186962 DEBUG nova.virt.disk.api [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.939 186962 DEBUG nova.objects.instance [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.957 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.958 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Ensure instance console log exists: /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.958 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.959 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:11 np0005539505 nova_compute[186958]: 2025-11-29 07:24:11.959 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:12 np0005539505 nova_compute[186958]: 2025-11-29 07:24:12.157 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Successfully created port: 3484baf0-bfbb-4b67-b841-a369f9a2c534 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.232 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Successfully updated port: 3484baf0-bfbb-4b67-b841-a369f9a2c534 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.246 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.247 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.247 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.387 186962 DEBUG nova.compute.manager [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.387 186962 DEBUG nova.compute.manager [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing instance network info cache due to event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.388 186962 DEBUG oslo_concurrency.lockutils [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:13 np0005539505 nova_compute[186958]: 2025-11-29 07:24:13.447 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:24:14 np0005539505 nova_compute[186958]: 2025-11-29 07:24:14.897 186962 DEBUG nova.network.neutron [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.162 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.163 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance network_info: |[{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.163 186962 DEBUG oslo_concurrency.lockutils [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.164 186962 DEBUG nova.network.neutron [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.167 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Start _get_guest_xml network_info=[{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.173 186962 WARNING nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.177 186962 DEBUG nova.virt.libvirt.host [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.178 186962 DEBUG nova.virt.libvirt.host [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.181 186962 DEBUG nova.virt.libvirt.host [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.182 186962 DEBUG nova.virt.libvirt.host [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.183 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.183 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.183 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.183 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.184 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.184 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.184 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.184 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.185 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.185 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.185 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.185 186962 DEBUG nova.virt.hardware [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.188 186962 DEBUG nova.virt.libvirt.vif [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:10Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.189 186962 DEBUG nova.network.os_vif_util [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.189 186962 DEBUG nova.network.os_vif_util [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.190 186962 DEBUG nova.objects.instance [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.195 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.337 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <uuid>2702fe48-44d0-408d-8d10-fd635e3779c9</uuid>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <name>instance-0000007a</name>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-864835491</nova:name>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:24:15</nova:creationTime>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        <nova:port uuid="3484baf0-bfbb-4b67-b841-a369f9a2c534">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="serial">2702fe48-44d0-408d-8d10-fd635e3779c9</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="uuid">2702fe48-44d0-408d-8d10-fd635e3779c9</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:34:30:8b"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <target dev="tap3484baf0-bf"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/console.log" append="off"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:24:15 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:24:15 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:24:15 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:24:15 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.338 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Preparing to wait for external event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.339 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.339 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.339 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.340 186962 DEBUG nova.virt.libvirt.vif [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:24:10Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.340 186962 DEBUG nova.network.os_vif_util [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.340 186962 DEBUG nova.network.os_vif_util [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.341 186962 DEBUG os_vif [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.341 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.342 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.342 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.345 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.345 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3484baf0-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.345 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3484baf0-bf, col_values=(('external_ids', {'iface-id': '3484baf0-bfbb-4b67-b841-a369f9a2c534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:30:8b', 'vm-uuid': '2702fe48-44d0-408d-8d10-fd635e3779c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.346 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 NetworkManager[55134]: <info>  [1764401055.3476] manager: (tap3484baf0-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.352 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.352 186962 INFO os_vif [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf')#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.412 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.413 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.413 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:34:30:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.414 186962 INFO nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Using config drive#033[00m
Nov 29 02:24:15 np0005539505 podman[238211]: 2025-11-29 07:24:15.473024696 +0000 UTC m=+0.084102781 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:24:15 np0005539505 podman[238210]: 2025-11-29 07:24:15.494673219 +0000 UTC m=+0.105853347 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, release=1755695350, container_name=openstack_network_exporter)
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.751 186962 INFO nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Creating config drive at /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.755 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoob0z41g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.879 186962 DEBUG oslo_concurrency.processutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoob0z41g" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:15 np0005539505 kernel: tap3484baf0-bf: entered promiscuous mode
Nov 29 02:24:15 np0005539505 NetworkManager[55134]: <info>  [1764401055.9276] manager: (tap3484baf0-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:15Z|00558|binding|INFO|Claiming lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 for this chassis.
Nov 29 02:24:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:15Z|00559|binding|INFO|3484baf0-bfbb-4b67-b841-a369f9a2c534: Claiming fa:16:3e:34:30:8b 10.100.0.12
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.935 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:30:8b 10.100.0.12'], port_security=['fa:16:3e:34:30:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3484baf0-bfbb-4b67-b841-a369f9a2c534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.937 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3484baf0-bfbb-4b67-b841-a369f9a2c534 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.939 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:24:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:15Z|00560|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 ovn-installed in OVS
Nov 29 02:24:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:15Z|00561|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 up in Southbound
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.949 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cbee3dc4-6d88-454e-b9e0-fa62211617b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.949 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.951 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[160c9d0f-67f5-41e8-baf9-e59887ec5c24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:15 np0005539505 nova_compute[186958]: 2025-11-29 07:24:15.952 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6c286504-76d1-4105-a362-b7a9a7b6d87f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:15 np0005539505 systemd-udevd[238269]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.964 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b44ba186-9a43-4081-86e7-c2b5ff09c944]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:15 np0005539505 NetworkManager[55134]: <info>  [1764401055.9689] device (tap3484baf0-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:24:15 np0005539505 NetworkManager[55134]: <info>  [1764401055.9699] device (tap3484baf0-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:24:15 np0005539505 systemd-machined[153285]: New machine qemu-64-instance-0000007a.
Nov 29 02:24:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:15.988 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ace7bf54-fdea-48d2-b34e-80511aa465a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:15 np0005539505 systemd[1]: Started Virtual Machine qemu-64-instance-0000007a.
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.017 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b93fa26f-b509-457b-a670-7a6578e74186]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 systemd-udevd[238276]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.022 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7848dd32-5343-4218-a870-2e38f3e5eddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 NetworkManager[55134]: <info>  [1764401056.0232] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/276)
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.053 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[83684735-977e-4dc1-b081-6c655f393263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.056 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf4823d-9eeb-4e0f-844c-2414407be9cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 NetworkManager[55134]: <info>  [1764401056.0752] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.080 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2dca5af6-e090-4591-a6f1-f793e982950b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.096 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec45326a-f1c9-4aa8-af22-5c6f3279c28c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650365, 'reachable_time': 21472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238304, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.111 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[173ad498-dd33-4f55-b3e7-121add8bc587]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 650365, 'tstamp': 650365}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238305, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.127 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a721a3e4-d6f3-4537-97f9-9b71b405e5fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 179], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650365, 'reachable_time': 21472, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238306, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.158 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99c18a9e-3a99-413a-b890-20ef3302b2fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.219 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7036b8-3d65-444a-9c61-049d5c61bf47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.221 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.221 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.222 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.223 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:16 np0005539505 NetworkManager[55134]: <info>  [1764401056.2244] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Nov 29 02:24:16 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.225 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.226 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.227 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:16Z|00562|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.241 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.242 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.243 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9e23aa4a-0c35-4b05-abbe-1b81ccdc33a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.244 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:24:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:16.244 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.384 186962 DEBUG nova.compute.manager [req-f3c5a062-52b8-4e48-82d1-39edd812c0bf req-a7eb749c-72e7-4314-a7fe-f64efa4e1378 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.384 186962 DEBUG oslo_concurrency.lockutils [req-f3c5a062-52b8-4e48-82d1-39edd812c0bf req-a7eb749c-72e7-4314-a7fe-f64efa4e1378 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.385 186962 DEBUG oslo_concurrency.lockutils [req-f3c5a062-52b8-4e48-82d1-39edd812c0bf req-a7eb749c-72e7-4314-a7fe-f64efa4e1378 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.385 186962 DEBUG oslo_concurrency.lockutils [req-f3c5a062-52b8-4e48-82d1-39edd812c0bf req-a7eb749c-72e7-4314-a7fe-f64efa4e1378 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.385 186962 DEBUG nova.compute.manager [req-f3c5a062-52b8-4e48-82d1-39edd812c0bf req-a7eb749c-72e7-4314-a7fe-f64efa4e1378 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Processing event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:24:16 np0005539505 podman[238338]: 2025-11-29 07:24:16.588480366 +0000 UTC m=+0.056216362 container create 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:24:16 np0005539505 systemd[1]: Started libpod-conmon-9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46.scope.
Nov 29 02:24:16 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:24:16 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ddb8afd5f7e8a82301006d5a2b59bed038dbc4d39c7269cdbe87a2b3efd32eb8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:24:16 np0005539505 podman[238338]: 2025-11-29 07:24:16.552981751 +0000 UTC m=+0.020717777 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:24:16 np0005539505 podman[238338]: 2025-11-29 07:24:16.658469826 +0000 UTC m=+0.126205842 container init 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:24:16 np0005539505 podman[238338]: 2025-11-29 07:24:16.669651723 +0000 UTC m=+0.137387729 container start 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:24:16 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [NOTICE]   (238364) : New worker (238367) forked
Nov 29 02:24:16 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [NOTICE]   (238364) : Loading success.
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.726 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.726 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401056.7254465, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.727 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Started (Lifecycle Event)#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.731 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.735 186962 INFO nova.virt.libvirt.driver [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance spawned successfully.#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.735 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.959 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.963 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.964 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.964 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.965 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.965 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.965 186962 DEBUG nova.virt.libvirt.driver [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:24:16 np0005539505 nova_compute[186958]: 2025-11-29 07:24:16.970 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.009 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.009 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401056.7294917, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.009 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.065 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.069 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401056.7313814, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.069 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.104 186962 DEBUG nova.network.neutron [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updated VIF entry in instance network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.105 186962 DEBUG nova.network.neutron [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.542 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.542 186962 DEBUG oslo_concurrency.lockutils [req-4511b089-ad0f-46fe-ae74-54afbfa3bb76 req-9bdfaa5f-2f8e-48e0-bed0-e9cc1c966727 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.546 186962 INFO nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Took 6.50 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.546 186962 DEBUG nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.547 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.591 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:24:17 np0005539505 podman[238376]: 2025-11-29 07:24:17.709939056 +0000 UTC m=+0.048223686 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.865 186962 INFO nova.compute.manager [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Took 7.28 seconds to build instance.#033[00m
Nov 29 02:24:17 np0005539505 nova_compute[186958]: 2025-11-29 07:24:17.895 186962 DEBUG oslo_concurrency.lockutils [None req-ec723e36-1673-4eea-9bbf-0d32057cfa5d 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.477 186962 DEBUG nova.compute.manager [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.477 186962 DEBUG oslo_concurrency.lockutils [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.478 186962 DEBUG oslo_concurrency.lockutils [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.478 186962 DEBUG oslo_concurrency.lockutils [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.478 186962 DEBUG nova.compute.manager [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:18 np0005539505 nova_compute[186958]: 2025-11-29 07:24:18.479 186962 WARNING nova.compute.manager [req-2a8d3c5e-4198-4fd5-ae2a-a230c44a90f6 req-d27f857d-7990-4d55-8ce2-a8ffd1f74bec 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:24:20 np0005539505 nova_compute[186958]: 2025-11-29 07:24:20.197 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:20 np0005539505 nova_compute[186958]: 2025-11-29 07:24:20.347 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:20 np0005539505 nova_compute[186958]: 2025-11-29 07:24:20.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:20 np0005539505 nova_compute[186958]: 2025-11-29 07:24:20.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.392 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401046.3911955, dbeeb9f5-635c-4a02-9525-1135c83a03a2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.392 186962 INFO nova.compute.manager [-] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:24:21 np0005539505 nova_compute[186958]: 2025-11-29 07:24:21.553 186962 DEBUG nova.compute.manager [None req-1af8d47f-e6b5-4188-ad73-cd917c76adeb - - - - - -] [instance: dbeeb9f5-635c-4a02-9525-1135c83a03a2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:22 np0005539505 nova_compute[186958]: 2025-11-29 07:24:22.369 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:22 np0005539505 nova_compute[186958]: 2025-11-29 07:24:22.370 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:22 np0005539505 nova_compute[186958]: 2025-11-29 07:24:22.370 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:24:22 np0005539505 nova_compute[186958]: 2025-11-29 07:24:22.370 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:23.943 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:23 np0005539505 nova_compute[186958]: 2025-11-29 07:24:23.943 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:23.945 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.759 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.783 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.783 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.783 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.807 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.807 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.808 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.808 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.881 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.949 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:24 np0005539505 nova_compute[186958]: 2025-11-29 07:24:24.951 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.009 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.185 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.187 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5551MB free_disk=73.07317352294922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.187 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.188 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.234 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating resource usage from migration e6c22fe2-15f6-43e8-b46c-ad0badaec107#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.273 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Migration e6c22fe2-15f6-43e8-b46c-ad0badaec107 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.274 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.274 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.339 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.356 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.357 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.387 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.418 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.544 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.559 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.582 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:24:25 np0005539505 nova_compute[186958]: 2025-11-29 07:24:25.582 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:26 np0005539505 nova_compute[186958]: 2025-11-29 07:24:26.528 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:26 np0005539505 nova_compute[186958]: 2025-11-29 07:24:26.529 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:26 np0005539505 nova_compute[186958]: 2025-11-29 07:24:26.529 186962 DEBUG nova.network.neutron [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:24:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:26.965 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:26.966 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:26.966 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:27 np0005539505 podman[238404]: 2025-11-29 07:24:27.743334731 +0000 UTC m=+0.076484195 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:24:27 np0005539505 podman[238403]: 2025-11-29 07:24:27.75564906 +0000 UTC m=+0.083752332 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:24:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:27.948 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:29Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:30:8b 10.100.0.12
Nov 29 02:24:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:29Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:30:8b 10.100.0.12
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.202 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.246 186962 DEBUG nova.network.neutron [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.260 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.373 186962 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.374 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Creating file /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/f5417c0a70284f778c707082d5f0ac74.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.374 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/f5417c0a70284f778c707082d5f0ac74.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.807 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/f5417c0a70284f778c707082d5f0ac74.tmp" returned: 1 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.808 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/f5417c0a70284f778c707082d5f0ac74.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.809 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Creating directory /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 02:24:30 np0005539505 nova_compute[186958]: 2025-11-29 07:24:30.809 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:31 np0005539505 nova_compute[186958]: 2025-11-29 07:24:31.019 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:31 np0005539505 nova_compute[186958]: 2025-11-29 07:24:31.024 186962 DEBUG nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:24:31 np0005539505 nova_compute[186958]: 2025-11-29 07:24:31.177 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:24:31 np0005539505 podman[238462]: 2025-11-29 07:24:31.744410282 +0000 UTC m=+0.068611653 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:24:31 np0005539505 podman[238463]: 2025-11-29 07:24:31.748981921 +0000 UTC m=+0.068468039 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:24:33 np0005539505 kernel: tap3484baf0-bf (unregistering): left promiscuous mode
Nov 29 02:24:33 np0005539505 NetworkManager[55134]: <info>  [1764401073.2023] device (tap3484baf0-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:24:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:33Z|00563|binding|INFO|Releasing lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 from this chassis (sb_readonly=0)
Nov 29 02:24:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:33Z|00564|binding|INFO|Setting lport 3484baf0-bfbb-4b67-b841-a369f9a2c534 down in Southbound
Nov 29 02:24:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:24:33Z|00565|binding|INFO|Removing iface tap3484baf0-bf ovn-installed in OVS
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.209 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.212 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.216 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:30:8b 10.100.0.12'], port_security=['fa:16:3e:34:30:8b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '2702fe48-44d0-408d-8d10-fd635e3779c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3484baf0-bfbb-4b67-b841-a369f9a2c534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.217 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3484baf0-bfbb-4b67-b841-a369f9a2c534 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.219 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.220 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0d24d0-dc69-4698-b734-584fef7cecb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.221 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.228 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Nov 29 02:24:33 np0005539505 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d0000007a.scope: Consumed 13.191s CPU time.
Nov 29 02:24:33 np0005539505 systemd-machined[153285]: Machine qemu-64-instance-0000007a terminated.
Nov 29 02:24:33 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [NOTICE]   (238364) : haproxy version is 2.8.14-c23fe91
Nov 29 02:24:33 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [NOTICE]   (238364) : path to executable is /usr/sbin/haproxy
Nov 29 02:24:33 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [WARNING]  (238364) : Exiting Master process...
Nov 29 02:24:33 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [ALERT]    (238364) : Current worker (238367) exited with code 143 (Terminated)
Nov 29 02:24:33 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238359]: [WARNING]  (238364) : All workers exited. Exiting... (0)
Nov 29 02:24:33 np0005539505 systemd[1]: libpod-9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46.scope: Deactivated successfully.
Nov 29 02:24:33 np0005539505 podman[238526]: 2025-11-29 07:24:33.339164957 +0000 UTC m=+0.044257834 container died 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 29 02:24:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46-userdata-shm.mount: Deactivated successfully.
Nov 29 02:24:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-ddb8afd5f7e8a82301006d5a2b59bed038dbc4d39c7269cdbe87a2b3efd32eb8-merged.mount: Deactivated successfully.
Nov 29 02:24:33 np0005539505 podman[238526]: 2025-11-29 07:24:33.375520306 +0000 UTC m=+0.080613173 container cleanup 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:24:33 np0005539505 systemd[1]: libpod-conmon-9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46.scope: Deactivated successfully.
Nov 29 02:24:33 np0005539505 podman[238556]: 2025-11-29 07:24:33.433901788 +0000 UTC m=+0.040126447 container remove 9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.434 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.439 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dadb4e89-2364-4bae-9633-33bb3d4734b1]: (4, ('Sat Nov 29 07:24:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46)\n9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46\nSat Nov 29 07:24:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46)\n9a5cf7859f10f641009ad27313aa7bb4330d4f472c625ca5dbe252499339bb46\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.439 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.441 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b3b98e-4e8c-4cb5-867b-2df33487ea01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.442 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.444 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.459 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.462 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0a918c59-b49f-41ed-9c9a-4f77406cc7d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.478 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[07726656-4b9b-4b65-adda-a099a69589b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.479 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[52fa35b7-eb50-4a07-8de3-385284a61449]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.495 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[79058db2-27e4-4ab1-9a5d-3533ba7a48db]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650359, 'reachable_time': 31288, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238591, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.497 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:24:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:24:33.497 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e4325ff2-0e8e-4d7c-b910-b9258c66a384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:24:33 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.657 186962 DEBUG nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.658 186962 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.658 186962 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.658 186962 DEBUG oslo_concurrency.lockutils [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.659 186962 DEBUG nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:33 np0005539505 nova_compute[186958]: 2025-11-29 07:24:33.659 186962 WARNING nova.compute.manager [req-e6da2939-7cb3-4532-be1e-9c7e53a974f9 req-d161dea3-0984-4d7a-9eaf-537d3f74967c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-unplugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.044 186962 INFO nova.virt.libvirt.driver [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.049 186962 INFO nova.virt.libvirt.driver [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Instance destroyed successfully.#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.050 186962 DEBUG nova.virt.libvirt.vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image
_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:26Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.050 186962 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:34:30:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.051 186962 DEBUG nova.network.os_vif_util [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.052 186962 DEBUG os_vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.054 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.054 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3484baf0-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.059 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.061 186962 INFO os_vif [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf')#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.065 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.123 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.125 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.186 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.188 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk to 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.188 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.798 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk" returned: 0 in 0.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.799 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:24:34 np0005539505 nova_compute[186958]: 2025-11-29 07:24:34.800 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.036 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -C -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.config 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.config" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.038 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.038 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.204 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.283 186962 DEBUG oslo_concurrency.processutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -C -r /var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9_resize/disk.info 192.168.122.100:/var/lib/nova/instances/2702fe48-44d0-408d-8d10-fd635e3779c9/disk.info" returned: 0 in 0.245s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:24:35 np0005539505 nova_compute[186958]: 2025-11-29 07:24:35.730 186962 DEBUG neutronclient.v2_0.client [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3484baf0-bfbb-4b67-b841-a369f9a2c534 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.299 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.300 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.300 186962 DEBUG oslo_concurrency.lockutils [None req-64149074-e7fe-4f3f-abf6-7eb0a269d8aa 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.718 186962 DEBUG nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.719 186962 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.719 186962 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.719 186962 DEBUG oslo_concurrency.lockutils [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.720 186962 DEBUG nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:36 np0005539505 nova_compute[186958]: 2025-11-29 07:24:36.720 186962 WARNING nova.compute.manager [req-23f4ae49-089d-4d87-ad8b-b22b49bf73f7 req-c2a4959c-99e3-46e3-af8f-d3fd48d11bd6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 29 02:24:37 np0005539505 nova_compute[186958]: 2025-11-29 07:24:37.788 186962 DEBUG nova.compute.manager [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:37 np0005539505 nova_compute[186958]: 2025-11-29 07:24:37.788 186962 DEBUG nova.compute.manager [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing instance network info cache due to event network-changed-3484baf0-bfbb-4b67-b841-a369f9a2c534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:24:37 np0005539505 nova_compute[186958]: 2025-11-29 07:24:37.789 186962 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:37 np0005539505 nova_compute[186958]: 2025-11-29 07:24:37.789 186962 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:37 np0005539505 nova_compute[186958]: 2025-11-29 07:24:37.789 186962 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Refreshing network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:24:39 np0005539505 nova_compute[186958]: 2025-11-29 07:24:39.058 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:39 np0005539505 nova_compute[186958]: 2025-11-29 07:24:39.483 186962 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updated VIF entry in instance network info cache for port 3484baf0-bfbb-4b67-b841-a369f9a2c534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:24:39 np0005539505 nova_compute[186958]: 2025-11-29 07:24:39.483 186962 DEBUG nova.network.neutron [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:39 np0005539505 nova_compute[186958]: 2025-11-29 07:24:39.875 186962 DEBUG oslo_concurrency.lockutils [req-c89c3a90-90d3-437d-b105-ac43d27e0b44 req-11701fdd-9974-4395-80a4-63e08560be7d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.205 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.357 186962 DEBUG nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.358 186962 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.358 186962 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.358 186962 DEBUG oslo_concurrency.lockutils [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.359 186962 DEBUG nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:40 np0005539505 nova_compute[186958]: 2025-11-29 07:24:40.359 186962 WARNING nova.compute.manager [req-8a1539a9-9bfc-471e-818c-cad0d9236cef req-09db0890-b6da-467a-8a68-404d349cbd5b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 29 02:24:41 np0005539505 nova_compute[186958]: 2025-11-29 07:24:41.590 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:41 np0005539505 nova_compute[186958]: 2025-11-29 07:24:41.591 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:41 np0005539505 nova_compute[186958]: 2025-11-29 07:24:41.591 186962 DEBUG nova.compute.manager [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Going to confirm migration 18 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 02:24:41 np0005539505 nova_compute[186958]: 2025-11-29 07:24:41.625 186962 DEBUG nova.objects.instance [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'info_cache' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.077 186962 DEBUG neutronclient.v2_0.client [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 3484baf0-bfbb-4b67-b841-a369f9a2c534 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.078 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.079 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.079 186962 DEBUG nova.network.neutron [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.497 186962 DEBUG nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.498 186962 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.498 186962 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.498 186962 DEBUG oslo_concurrency.lockutils [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.499 186962 DEBUG nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] No waiting events found dispatching network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:24:42 np0005539505 nova_compute[186958]: 2025-11-29 07:24:42.499 186962 WARNING nova.compute.manager [req-a2606dcf-5427-4e1a-91db-9be593f5223a req-eb7380d0-4699-41ba-a7da-189ec5ac3f40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Received unexpected event network-vif-plugged-3484baf0-bfbb-4b67-b841-a369f9a2c534 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.222 186962 DEBUG nova.network.neutron [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Updating instance_info_cache with network_info: [{"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.250 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-2702fe48-44d0-408d-8d10-fd635e3779c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.250 186962 DEBUG nova.objects.instance [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid 2702fe48-44d0-408d-8d10-fd635e3779c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.279 186962 DEBUG nova.virt.libvirt.vif [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:24:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-864835491',display_name='tempest-ServerDiskConfigTestJSON-server-864835491',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-864835491',id=122,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:24:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-ybkvc9v9',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:24:40Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=2702fe48-44d0-408d-8d10-fd635e3779c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.279 186962 DEBUG nova.network.os_vif_util [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "address": "fa:16:3e:34:30:8b", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3484baf0-bf", "ovs_interfaceid": "3484baf0-bfbb-4b67-b841-a369f9a2c534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.280 186962 DEBUG nova.network.os_vif_util [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.280 186962 DEBUG os_vif [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.282 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3484baf0-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.283 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.285 186962 INFO os_vif [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:30:8b,bridge_name='br-int',has_traffic_filtering=True,id=3484baf0-bfbb-4b67-b841-a369f9a2c534,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3484baf0-bf')#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.285 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.286 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.372 186962 DEBUG nova.compute.provider_tree [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.392 186962 DEBUG nova.scheduler.client.report [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.471 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.608 186962 INFO nova.scheduler.client.report [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocation for migration e6c22fe2-15f6-43e8-b46c-ad0badaec107#033[00m
Nov 29 02:24:43 np0005539505 nova_compute[186958]: 2025-11-29 07:24:43.691 186962 DEBUG oslo_concurrency.lockutils [None req-c40bce44-76ed-49a4-b5d6-7e0f706db039 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2702fe48-44d0-408d-8d10-fd635e3779c9" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:24:44 np0005539505 nova_compute[186958]: 2025-11-29 07:24:44.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:45 np0005539505 nova_compute[186958]: 2025-11-29 07:24:45.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:45 np0005539505 podman[238605]: 2025-11-29 07:24:45.716298957 +0000 UTC m=+0.048381941 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:24:45 np0005539505 podman[238604]: 2025-11-29 07:24:45.721032861 +0000 UTC m=+0.055399829 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.092 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:24:48.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:24:48 np0005539505 nova_compute[186958]: 2025-11-29 07:24:48.472 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401073.470836, 2702fe48-44d0-408d-8d10-fd635e3779c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:24:48 np0005539505 nova_compute[186958]: 2025-11-29 07:24:48.472 186962 INFO nova.compute.manager [-] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:24:48 np0005539505 nova_compute[186958]: 2025-11-29 07:24:48.572 186962 DEBUG nova.compute.manager [None req-b2351465-9e97-4532-a3d7-d3df31fdc203 - - - - - -] [instance: 2702fe48-44d0-408d-8d10-fd635e3779c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:24:48 np0005539505 podman[238646]: 2025-11-29 07:24:48.708971775 +0000 UTC m=+0.042777022 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:24:49 np0005539505 nova_compute[186958]: 2025-11-29 07:24:49.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:50 np0005539505 nova_compute[186958]: 2025-11-29 07:24:50.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:54 np0005539505 nova_compute[186958]: 2025-11-29 07:24:54.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:55 np0005539505 nova_compute[186958]: 2025-11-29 07:24:55.211 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:24:58 np0005539505 podman[238663]: 2025-11-29 07:24:58.706533979 +0000 UTC m=+0.043669687 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:24:58 np0005539505 podman[238664]: 2025-11-29 07:24:58.733194363 +0000 UTC m=+0.067930943 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:24:59 np0005539505 nova_compute[186958]: 2025-11-29 07:24:59.070 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:00 np0005539505 nova_compute[186958]: 2025-11-29 07:25:00.251 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:02 np0005539505 podman[238711]: 2025-11-29 07:25:02.73420208 +0000 UTC m=+0.067239135 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:02 np0005539505 podman[238712]: 2025-11-29 07:25:02.757708085 +0000 UTC m=+0.088221948 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.581 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.582 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.598 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.708 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.709 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.715 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.715 186962 INFO nova.compute.claims [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.824 186962 DEBUG nova.compute.provider_tree [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.838 186962 DEBUG nova.scheduler.client.report [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.870 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.871 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.953 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.953 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:25:04 np0005539505 nova_compute[186958]: 2025-11-29 07:25:04.982 186962 INFO nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.007 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.167 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.169 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.170 186962 INFO nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Creating image(s)#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.170 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.171 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.172 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.185 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.244 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.245 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.246 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.257 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.273 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.310 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.311 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.349 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
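The two subprocess invocations above — a prlimit-guarded `qemu-img info` probe of the shared base image, then `qemu-img create` of the instance's qcow2 overlay — can be sketched as argv builders. This is an illustrative reconstruction, not Nova's code; the real invocations come from `oslo_concurrency.processutils` and `nova.virt.libvirt.imagebackend`, and the paths below are placeholders:

```python
import sys

def qemu_img_info_cmd(path, mem_limit=1 << 30, cpu_seconds=30):
    # The info probe is wrapped in oslo_concurrency.prlimit so a
    # malformed or hostile image cannot make qemu-img consume unbounded
    # address space (--as) or CPU time (--cpu) while parsing headers.
    return [sys.executable, "-m", "oslo_concurrency.prlimit",
            f"--as={mem_limit}", f"--cpu={cpu_seconds}", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", path, "--force-share", "--output=json"]

def qemu_img_create_cmd(base_image, overlay, size_bytes):
    # The instance disk is a copy-on-write qcow2 overlay whose backing
    # file is the raw base image under _base/; only the guest's writes
    # land in the overlay, so many instances share one base image.
    return ["env", "LC_ALL=C", "LANG=C",
            "qemu-img", "create", "-f", "qcow2",
            "-o", f"backing_file={base_image},backing_fmt=raw",
            overlay, str(size_bytes)]
```

With the defaults, `qemu_img_info_cmd` reproduces the `--as=1073741824 --cpu=30` limits seen in the trace.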
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.350 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.351 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.407 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.408 186962 DEBUG nova.virt.disk.api [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Checking if we can resize image /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.408 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.465 186962 DEBUG nova.policy [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '000fb7b950024e16902cd58f2ea16ac9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6d55e57bfd184513a304a61cc1cb3730', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.469 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.470 186962 DEBUG nova.virt.disk.api [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Cannot resize image /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
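The "Cannot resize image ... to a smaller size." line is the expected outcome of `nova.virt.disk.api.can_resize_image`: the overlay was just created at 1 GiB and the flavor's root disk is also 1 GiB, and only a strict grow is attempted. A minimal sketch of that check, with the virtual size taken as an argument instead of being read via `qemu-img info`:

```python
def can_resize_image(virtual_size: int, requested_size: int) -> bool:
    # A resize is only performed when the requested size is strictly
    # larger than what qemu-img info reports; an equal size is treated
    # the same as a shrink and skipped, which is what the trace shows
    # for size=1073741824 on a freshly created 1 GiB overlay.
    return requested_size > virtual_size
```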
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.470 186962 DEBUG nova.objects.instance [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.492 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.492 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Ensure instance console log exists: /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.493 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.493 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:05 np0005539505 nova_compute[186958]: 2025-11-29 07:25:05.493 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:06 np0005539505 nova_compute[186958]: 2025-11-29 07:25:06.653 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Successfully created port: bae72aab-bece-4ddf-8a55-f5925e45ca90 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:25:07 np0005539505 nova_compute[186958]: 2025-11-29 07:25:07.723 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Successfully updated port: bae72aab-bece-4ddf-8a55-f5925e45ca90 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:25:07 np0005539505 nova_compute[186958]: 2025-11-29 07:25:07.740 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:07 np0005539505 nova_compute[186958]: 2025-11-29 07:25:07.740 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:07 np0005539505 nova_compute[186958]: 2025-11-29 07:25:07.740 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:07 np0005539505 nova_compute[186958]: 2025-11-29 07:25:07.862 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:25:08 np0005539505 nova_compute[186958]: 2025-11-29 07:25:08.127 186962 DEBUG nova.compute.manager [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:08 np0005539505 nova_compute[186958]: 2025-11-29 07:25:08.127 186962 DEBUG nova.compute.manager [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing instance network info cache due to event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:08 np0005539505 nova_compute[186958]: 2025-11-29 07:25:08.127 186962 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.076 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.420 186962 DEBUG nova.network.neutron [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.449 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.449 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance network_info: |[{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.450 186962 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.451 186962 DEBUG nova.network.neutron [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.455 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Start _get_guest_xml network_info=[{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.463 186962 WARNING nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.475 186962 DEBUG nova.virt.libvirt.host [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.475 186962 DEBUG nova.virt.libvirt.host [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.480 186962 DEBUG nova.virt.libvirt.host [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.480 186962 DEBUG nova.virt.libvirt.host [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.482 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.482 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.482 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.483 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.483 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.483 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.483 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.483 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.484 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.484 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.484 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.485 186962 DEBUG nova.virt.hardware [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
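The topology lines above (limits of 65536 each, one candidate for one vCPU) reflect Nova enumerating every (sockets, cores, threads) triple whose product equals the vCPU count. A simplified sketch consistent with this trace; the real logic in `nova.virt.hardware._get_possible_cpu_topologies` also applies flavor and image preferences when sorting:

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Keep triples with sockets * cores * threads == vcpus, bounded by
    # the per-dimension maxima.  For a 1-vCPU flavor the only candidate
    # is (1, 1, 1), matching "Got 1 possible topologies" in the log.
    found = []
    for s in range(1, min(max_sockets, vcpus) + 1):
        for c in range(1, min(max_cores, vcpus) + 1):
            for t in range(1, min(max_threads, vcpus) + 1):
                if s * c * t == vcpus:
                    found.append((s, c, t))
    return found
```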
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.488 186962 DEBUG nova.virt.libvirt.vif [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDis
kConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:05Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.488 186962 DEBUG nova.network.os_vif_util [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.489 186962 DEBUG nova.network.os_vif_util [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.490 186962 DEBUG nova.objects.instance [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.510 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <uuid>3b61bf63-8328-4d31-93e5-0a19ca27cd63</uuid>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <name>instance-0000007d</name>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-284566059</nova:name>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:25:09</nova:creationTime>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:user uuid="000fb7b950024e16902cd58f2ea16ac9">tempest-ServerDiskConfigTestJSON-1282760174-project-member</nova:user>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:project uuid="6d55e57bfd184513a304a61cc1cb3730">tempest-ServerDiskConfigTestJSON-1282760174</nova:project>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        <nova:port uuid="bae72aab-bece-4ddf-8a55-f5925e45ca90">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="serial">3b61bf63-8328-4d31-93e5-0a19ca27cd63</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="uuid">3b61bf63-8328-4d31-93e5-0a19ca27cd63</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:04:b9:c5"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <target dev="tapbae72aab-be"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/console.log" append="off"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:25:09 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:25:09 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:25:09 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:25:09 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.512 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Preparing to wait for external event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.512 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.512 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.513 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.513 186962 DEBUG nova.virt.libvirt.vif [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:05Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.514 186962 DEBUG nova.network.os_vif_util [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.514 186962 DEBUG nova.network.os_vif_util [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.515 186962 DEBUG os_vif [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.515 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.515 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.516 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.519 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.519 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbae72aab-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.520 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbae72aab-be, col_values=(('external_ids', {'iface-id': 'bae72aab-bece-4ddf-8a55-f5925e45ca90', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:04:b9:c5', 'vm-uuid': '3b61bf63-8328-4d31-93e5-0a19ca27cd63'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:09 np0005539505 NetworkManager[55134]: <info>  [1764401109.5223] manager: (tapbae72aab-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.527 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.528 186962 INFO os_vif [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be')#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.586 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.587 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.587 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] No VIF found with MAC fa:16:3e:04:b9:c5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.588 186962 INFO nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Using config drive#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.913 186962 INFO nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Creating config drive at /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config#033[00m
Nov 29 02:25:09 np0005539505 nova_compute[186958]: 2025-11-29 07:25:09.919 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvhb2uw2o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.046 186962 DEBUG oslo_concurrency.processutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvhb2uw2o" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:10 np0005539505 kernel: tapbae72aab-be: entered promiscuous mode
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.1043] manager: (tapbae72aab-be): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Nov 29 02:25:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:10Z|00566|binding|INFO|Claiming lport bae72aab-bece-4ddf-8a55-f5925e45ca90 for this chassis.
Nov 29 02:25:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:10Z|00567|binding|INFO|bae72aab-bece-4ddf-8a55-f5925e45ca90: Claiming fa:16:3e:04:b9:c5 10.100.0.9
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.104 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.114 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:b9:c5 10.100.0.9'], port_security=['fa:16:3e:04:b9:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3b61bf63-8328-4d31-93e5-0a19ca27cd63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=bae72aab-bece-4ddf-8a55-f5925e45ca90) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:25:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:10Z|00568|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 ovn-installed in OVS
Nov 29 02:25:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:10Z|00569|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 up in Southbound
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.117 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.116 104094 INFO neutron.agent.ovn.metadata.agent [-] Port bae72aab-bece-4ddf-8a55-f5925e45ca90 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 bound to our chassis#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.119 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.122 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 systemd-udevd[238781]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.131 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[47cda414-ddb1-46dd-b770-0bc79be0bcba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.133 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b34af6b-e1 in ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.136 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b34af6b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.136 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0078d043-5f50-4ecc-bff6-68a257613830]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.137 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[700ff1bc-a3a4-468d-b6d5-0ff8fdf0aa00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.146 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[746c30a5-6f12-49d7-a5b0-e589b951afde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.1492] device (tapbae72aab-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.1499] device (tapbae72aab-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:25:10 np0005539505 systemd-machined[153285]: New machine qemu-65-instance-0000007d.
Nov 29 02:25:10 np0005539505 systemd[1]: Started Virtual Machine qemu-65-instance-0000007d.
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.170 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[95cd5697-1461-4636-92ea-746fa268fb4a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.195 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0640dd10-eebf-4318-8822-aaa3a78fad51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.200 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5be484f5-9184-4b6e-8ffd-a7261da64276]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.2010] manager: (tap9b34af6b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.235 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4f884bb5-5ac5-4338-8369-da29bc37bf78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.238 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[33e0953b-a948-448a-aa59-0ec6f09408f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.253 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.2622] device (tap9b34af6b-e0): carrier: link connected
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.268 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ad277f79-225c-4571-974b-c6e62f05f360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.284 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ef4d70-a0ce-4218-8ab0-0eb5778bd094]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655784, 'reachable_time': 16122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238815, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.301 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9775e8e4-34e3-4f9b-a343-cf0f3960aa43]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:40d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 655784, 'tstamp': 655784}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238816, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.317 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb5ef06-c8e8-4232-9494-86d38520a7ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b34af6b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:40:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 182], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655784, 'reachable_time': 16122, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238817, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.347 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1d1845-448d-4b74-841f-64dd2b5eb7c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.408 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[89dec2cb-ad53-4df2-a303-0ad2e2050c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.409 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.410 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.410 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b34af6b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.412 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 NetworkManager[55134]: <info>  [1764401110.4129] manager: (tap9b34af6b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Nov 29 02:25:10 np0005539505 kernel: tap9b34af6b-e0: entered promiscuous mode
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.415 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.416 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b34af6b-e0, col_values=(('external_ids', {'iface-id': '88f3bff1-58a0-4231-87c4-807c4c2657d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.417 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:10Z|00570|binding|INFO|Releasing lport 88f3bff1-58a0-4231-87c4-807c4c2657d5 from this chassis (sb_readonly=0)
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.433 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.434 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.435 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[012f09fe-d198-4e08-a348-097593830b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.436 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.pid.haproxy
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 9b34af6b-edf9-4b27-b1dc-2b18c2eec958
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:25:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:10.437 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'env', 'PROCESS_TAG=haproxy-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b34af6b-edf9-4b27-b1dc-2b18c2eec958.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.780 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401110.780159, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.781 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Started (Lifecycle Event)#033[00m
Nov 29 02:25:10 np0005539505 podman[238855]: 2025-11-29 07:25:10.792692823 +0000 UTC m=+0.056622394 container create 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.811 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.815 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401110.7803078, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.816 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.828 186962 DEBUG nova.compute.manager [req-f617e8e1-1616-4b15-9951-ab03f625e156 req-04128898-66e8-4add-9f08-b0c6d7dcba2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.829 186962 DEBUG oslo_concurrency.lockutils [req-f617e8e1-1616-4b15-9951-ab03f625e156 req-04128898-66e8-4add-9f08-b0c6d7dcba2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.829 186962 DEBUG oslo_concurrency.lockutils [req-f617e8e1-1616-4b15-9951-ab03f625e156 req-04128898-66e8-4add-9f08-b0c6d7dcba2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.829 186962 DEBUG oslo_concurrency.lockutils [req-f617e8e1-1616-4b15-9951-ab03f625e156 req-04128898-66e8-4add-9f08-b0c6d7dcba2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.830 186962 DEBUG nova.compute.manager [req-f617e8e1-1616-4b15-9951-ab03f625e156 req-04128898-66e8-4add-9f08-b0c6d7dcba2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Processing event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.830 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:25:10 np0005539505 systemd[1]: Started libpod-conmon-6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941.scope.
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.835 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.836 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.841 186962 INFO nova.virt.libvirt.driver [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance spawned successfully.#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.841 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.843 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401110.834264, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.843 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:25:10 np0005539505 podman[238855]: 2025-11-29 07:25:10.759755701 +0000 UTC m=+0.023685292 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:25:10 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.869 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:10 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eafd999bc40780f48923e9689d499affd8a1d5042cea35ca9df6fdd803a47f99/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.878 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.878 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.878 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.879 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.879 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.879 186962 DEBUG nova.virt.libvirt.driver [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:25:10 np0005539505 podman[238855]: 2025-11-29 07:25:10.88268705 +0000 UTC m=+0.146616621 container init 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.883 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:25:10 np0005539505 podman[238855]: 2025-11-29 07:25:10.888263478 +0000 UTC m=+0.152193049 container start 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:25:10 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [NOTICE]   (238875) : New worker (238877) forked
Nov 29 02:25:10 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [NOTICE]   (238875) : Loading success.
Nov 29 02:25:10 np0005539505 nova_compute[186958]: 2025-11-29 07:25:10.919 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.053 186962 INFO nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Took 5.89 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.053 186962 DEBUG nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.247 186962 INFO nova.compute.manager [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Took 6.59 seconds to build instance.#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.290 186962 DEBUG oslo_concurrency.lockutils [None req-720915f9-f56f-4f56-a66d-1975be6a8317 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.580 186962 DEBUG nova.network.neutron [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updated VIF entry in instance network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.582 186962 DEBUG nova.network.neutron [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:11 np0005539505 nova_compute[186958]: 2025-11-29 07:25:11.606 186962 DEBUG oslo_concurrency.lockutils [req-30bfc431-3526-406f-810c-1070f7349e65 req-2a79c8e3-3a54-4468-b6da-a840e358596f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.964 186962 DEBUG nova.compute.manager [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.965 186962 DEBUG oslo_concurrency.lockutils [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.965 186962 DEBUG oslo_concurrency.lockutils [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.965 186962 DEBUG oslo_concurrency.lockutils [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.965 186962 DEBUG nova.compute.manager [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:12 np0005539505 nova_compute[186958]: 2025-11-29 07:25:12.966 186962 WARNING nova.compute.manager [req-4a5eb22e-7702-41f8-8aa8-924c049f3ef6 req-9f04311e-d2e3-4d07-b4e1-9b0c9825c93a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_prep.#033[00m
Nov 29 02:25:14 np0005539505 nova_compute[186958]: 2025-11-29 07:25:14.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:15 np0005539505 nova_compute[186958]: 2025-11-29 07:25:15.085 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:15 np0005539505 nova_compute[186958]: 2025-11-29 07:25:15.085 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:15 np0005539505 nova_compute[186958]: 2025-11-29 07:25:15.085 186962 DEBUG nova.network.neutron [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:15 np0005539505 nova_compute[186958]: 2025-11-29 07:25:15.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:16 np0005539505 podman[238886]: 2025-11-29 07:25:16.730904137 +0000 UTC m=+0.056368096 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 29 02:25:16 np0005539505 podman[238887]: 2025-11-29 07:25:16.734644593 +0000 UTC m=+0.053153005 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.688 186962 DEBUG nova.network.neutron [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.710 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.862 186962 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.863 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Creating file /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/e5ea4643901d44028481b42b31394564.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Nov 29 02:25:17 np0005539505 nova_compute[186958]: 2025-11-29 07:25:17.863 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/e5ea4643901d44028481b42b31394564.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.322 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/e5ea4643901d44028481b42b31394564.tmp" returned: 1 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.323 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/e5ea4643901d44028481b42b31394564.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.324 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Creating directory /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.325 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.558 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63" returned: 0 in 0.233s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:18 np0005539505 nova_compute[186958]: 2025-11-29 07:25:18.562 186962 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:25:19 np0005539505 nova_compute[186958]: 2025-11-29 07:25:19.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:19 np0005539505 podman[238932]: 2025-11-29 07:25:19.734497515 +0000 UTC m=+0.055640316 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:25:20 np0005539505 nova_compute[186958]: 2025-11-29 07:25:20.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.622 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.622 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.623 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:25:21 np0005539505 nova_compute[186958]: 2025-11-29 07:25:21.623 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:22Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:04:b9:c5 10.100.0.9
Nov 29 02:25:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:22Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:04:b9:c5 10.100.0.9
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.847 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.868 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.868 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.869 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.869 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.905 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.905 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.906 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:25:23 np0005539505 nova_compute[186958]: 2025-11-29 07:25:23.906 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 02:25:24 np0005539505 nova_compute[186958]: 2025-11-29 07:25:24.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.257 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.315 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.377 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.378 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.437 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.583 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.584 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5514MB free_disk=73.04518508911133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.584 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.584 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.637 186962 INFO nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating resource usage from migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.661 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.662 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.662 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.713 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.729 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.750 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:25:25 np0005539505 nova_compute[186958]: 2025-11-29 07:25:25.750 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:25:27 np0005539505 nova_compute[186958]: 2025-11-29 07:25:27.502 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:25:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:27.502 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:25:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:27.502 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:25:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:27.503 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:25:29 np0005539505 nova_compute[186958]: 2025-11-29 07:25:29.513 186962 DEBUG nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 02:25:29 np0005539505 nova_compute[186958]: 2025-11-29 07:25:29.534 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:29 np0005539505 podman[238974]: 2025-11-29 07:25:29.713929714 +0000 UTC m=+0.052212049 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:25:29 np0005539505 podman[238975]: 2025-11-29 07:25:29.744330344 +0000 UTC m=+0.079656045 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:25:30 np0005539505 nova_compute[186958]: 2025-11-29 07:25:30.306 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:30.385 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:25:30 np0005539505 nova_compute[186958]: 2025-11-29 07:25:30.385 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:30.386 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:25:31 np0005539505 kernel: tapbae72aab-be (unregistering): left promiscuous mode
Nov 29 02:25:31 np0005539505 NetworkManager[55134]: <info>  [1764401131.8120] device (tapbae72aab-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:25:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:31Z|00571|binding|INFO|Releasing lport bae72aab-bece-4ddf-8a55-f5925e45ca90 from this chassis (sb_readonly=0)
Nov 29 02:25:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:31Z|00572|binding|INFO|Setting lport bae72aab-bece-4ddf-8a55-f5925e45ca90 down in Southbound
Nov 29 02:25:31 np0005539505 nova_compute[186958]: 2025-11-29 07:25:31.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:25:31Z|00573|binding|INFO|Removing iface tapbae72aab-be ovn-installed in OVS
Nov 29 02:25:31 np0005539505 nova_compute[186958]: 2025-11-29 07:25:31.838 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:31 np0005539505 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Nov 29 02:25:31 np0005539505 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d0000007d.scope: Consumed 13.024s CPU time.
Nov 29 02:25:31 np0005539505 systemd-machined[153285]: Machine qemu-65-instance-0000007d terminated.
Nov 29 02:25:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:31.905 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:b9:c5 10.100.0.9'], port_security=['fa:16:3e:04:b9:c5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '3b61bf63-8328-4d31-93e5-0a19ca27cd63', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6d55e57bfd184513a304a61cc1cb3730', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44bb1ac9-49ae-4c0a-8013-0db5efadb536', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=804567bc-6857-4eb6-aa00-b449f09c69a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=bae72aab-bece-4ddf-8a55-f5925e45ca90) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:25:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:31.906 104094 INFO neutron.agent.ovn.metadata.agent [-] Port bae72aab-bece-4ddf-8a55-f5925e45ca90 in datapath 9b34af6b-edf9-4b27-b1dc-2b18c2eec958 unbound from our chassis
Nov 29 02:25:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:31.908 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 29 02:25:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:31.911 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9d79bf-3cb0-4082-a831-1e4b9a37c39f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:31.912 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 namespace which is not needed anymore
Nov 29 02:25:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [NOTICE]   (238875) : haproxy version is 2.8.14-c23fe91
Nov 29 02:25:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [NOTICE]   (238875) : path to executable is /usr/sbin/haproxy
Nov 29 02:25:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [WARNING]  (238875) : Exiting Master process...
Nov 29 02:25:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [ALERT]    (238875) : Current worker (238877) exited with code 143 (Terminated)
Nov 29 02:25:32 np0005539505 neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958[238871]: [WARNING]  (238875) : All workers exited. Exiting... (0)
Nov 29 02:25:32 np0005539505 systemd[1]: libpod-6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941.scope: Deactivated successfully.
Nov 29 02:25:32 np0005539505 podman[239048]: 2025-11-29 07:25:32.041630133 +0000 UTC m=+0.043854952 container died 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.059 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.064 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:32 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941-userdata-shm.mount: Deactivated successfully.
Nov 29 02:25:32 np0005539505 systemd[1]: var-lib-containers-storage-overlay-eafd999bc40780f48923e9689d499affd8a1d5042cea35ca9df6fdd803a47f99-merged.mount: Deactivated successfully.
Nov 29 02:25:32 np0005539505 podman[239048]: 2025-11-29 07:25:32.092945356 +0000 UTC m=+0.095170175 container cleanup 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:25:32 np0005539505 systemd[1]: libpod-conmon-6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941.scope: Deactivated successfully.
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.170 186962 DEBUG nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.172 186962 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.172 186962 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.173 186962 DEBUG oslo_concurrency.lockutils [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.173 186962 DEBUG nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.174 186962 WARNING nova.compute.manager [req-57a491f8-6bd3-49d0-9ffd-055b2f0d5910 req-31a5fa11-440f-4c35-a777-3956961c9a3f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-unplugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_migrating.
Nov 29 02:25:32 np0005539505 podman[239091]: 2025-11-29 07:25:32.351591416 +0000 UTC m=+0.237450011 container remove 6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.359 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6657b625-b23c-4882-8624-18bc23f27fbc]: (4, ('Sat Nov 29 07:25:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941)\n6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941\nSat Nov 29 07:25:32 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 (6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941)\n6eb263dc9762d4a40bf5f4947b5f2ecd20a811bd7d99daadc7f41ce80f08e941\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7cb9d9-83fb-423b-9778-c1e5456904eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.361 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b34af6b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:32 np0005539505 kernel: tap9b34af6b-e0: left promiscuous mode
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.391 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.394 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2a8175-9146-4a23-a769-a3ea2db020a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.419 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e1709669-343c-4dad-bd32-12c8a56cfb62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.421 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[304a059d-d416-4b0c-bb88-4bb23e2a9e8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.440 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a083b027-eca3-4ace-baac-a4d4276675e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 655777, 'reachable_time': 22551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239109, 'error': None, 'target': 'ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:25:32 np0005539505 systemd[1]: run-netns-ovnmeta\x2d9b34af6b\x2dedf9\x2d4b27\x2db1dc\x2d2b18c2eec958.mount: Deactivated successfully.
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.445 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b34af6b-edf9-4b27-b1dc-2b18c2eec958 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:25:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:32.445 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[fba28c64-0122-4c6c-a072-8226869e787d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.527 186962 INFO nova.virt.libvirt.driver [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.533 186962 INFO nova.virt.libvirt.driver [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Instance destroyed successfully.#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.534 186962 DEBUG nova.virt.libvirt.vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_
hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:14Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.534 186962 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "vif_mac": "fa:16:3e:04:b9:c5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.535 186962 DEBUG nova.network.os_vif_util [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.535 186962 DEBUG os_vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.538 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae72aab-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.546 186962 INFO os_vif [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be')#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.550 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.603 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.604 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.694 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.696 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk to 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:25:32 np0005539505 nova_compute[186958]: 2025-11-29 07:25:32.696 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.534 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk" returned: 0 in 0.838s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.535 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.536 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.config 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:33 np0005539505 podman[239121]: 2025-11-29 07:25:33.725014267 +0000 UTC m=+0.058276700 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:25:33 np0005539505 podman[239120]: 2025-11-29 07:25:33.74099496 +0000 UTC m=+0.069058196 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.770 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -C -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.config 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.config" returned: 0 in 0.234s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.771 186962 DEBUG nova.virt.libvirt.volume.remotefs [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Copying file /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Nov 29 02:25:33 np0005539505 nova_compute[186958]: 2025-11-29 07:25:33.772 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.info 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.058 186962 DEBUG oslo_concurrency.processutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] CMD "scp -C -r /var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63_resize/disk.info 192.168.122.101:/var/lib/nova/instances/3b61bf63-8328-4d31-93e5-0a19ca27cd63/disk.info" returned: 0 in 0.286s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.256 186962 DEBUG nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.256 186962 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.257 186962 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.257 186962 DEBUG oslo_concurrency.lockutils [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.257 186962 DEBUG nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.257 186962 WARNING nova.compute.manager [req-83a6e4c4-a333-43f6-ac8f-bdaee8ee39dc req-74708bfe-06f8-4967-99d7-d3d2f768196f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.418 186962 DEBUG neutronclient.v2_0.client [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bae72aab-bece-4ddf-8a55-f5925e45ca90 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.540 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.541 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:34 np0005539505 nova_compute[186958]: 2025-11-29 07:25:34.542 186962 DEBUG oslo_concurrency.lockutils [None req-11be2dd5-2cc4-43a8-b9eb-aeaa6e7ec32f 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:35 np0005539505 nova_compute[186958]: 2025-11-29 07:25:35.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:25:37.388 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:37 np0005539505 nova_compute[186958]: 2025-11-29 07:25:37.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.044 186962 DEBUG nova.compute.manager [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.044 186962 DEBUG nova.compute.manager [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing instance network info cache due to event network-changed-bae72aab-bece-4ddf-8a55-f5925e45ca90. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.044 186962 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.045 186962 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.045 186962 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Refreshing network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:25:40 np0005539505 nova_compute[186958]: 2025-11-29 07:25:40.388 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.561 186962 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updated VIF entry in instance network info cache for port bae72aab-bece-4ddf-8a55-f5925e45ca90. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.561 186962 DEBUG nova.network.neutron [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.590 186962 DEBUG oslo_concurrency.lockutils [req-ffbd0621-ff7d-4f24-b730-ec8143627451 req-8d5beaf4-3e56-4712-8885-72097185fd70 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.944 186962 DEBUG nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.945 186962 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.946 186962 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.946 186962 DEBUG oslo_concurrency.lockutils [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.947 186962 DEBUG nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:41 np0005539505 nova_compute[186958]: 2025-11-29 07:25:41.947 186962 WARNING nova.compute.manager [req-1bd185b2-b8b3-4383-b4dc-1d4a4d60aeff req-8abb669a-077b-4135-94a1-c34890a59cdd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 29 02:25:42 np0005539505 nova_compute[186958]: 2025-11-29 07:25:42.656 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.057 186962 DEBUG nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.057 186962 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.058 186962 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.058 186962 DEBUG oslo_concurrency.lockutils [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.058 186962 DEBUG nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] No waiting events found dispatching network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.059 186962 WARNING nova.compute.manager [req-50211ff9-a3b1-4aab-90e3-9b8061093b39 req-a3070f2c-b79b-42fb-b841-798acbb79333 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Received unexpected event network-vif-plugged-bae72aab-bece-4ddf-8a55-f5925e45ca90 for instance with vm_state resized and task_state None.#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.828 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.829 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.829 186962 DEBUG nova.compute.manager [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Going to confirm migration 20 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 29 02:25:44 np0005539505 nova_compute[186958]: 2025-11-29 07:25:44.877 186962 DEBUG nova.objects.instance [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'info_cache' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:45 np0005539505 nova_compute[186958]: 2025-11-29 07:25:45.390 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:45 np0005539505 nova_compute[186958]: 2025-11-29 07:25:45.474 186962 DEBUG neutronclient.v2_0.client [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port bae72aab-bece-4ddf-8a55-f5925e45ca90 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Nov 29 02:25:45 np0005539505 nova_compute[186958]: 2025-11-29 07:25:45.475 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:25:45 np0005539505 nova_compute[186958]: 2025-11-29 07:25:45.475 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquired lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:25:45 np0005539505 nova_compute[186958]: 2025-11-29 07:25:45.475 186962 DEBUG nova.network.neutron [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:25:46 np0005539505 nova_compute[186958]: 2025-11-29 07:25:46.944 186962 DEBUG nova.network.neutron [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Updating instance_info_cache with network_info: [{"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:25:46 np0005539505 nova_compute[186958]: 2025-11-29 07:25:46.998 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Releasing lock "refresh_cache-3b61bf63-8328-4d31-93e5-0a19ca27cd63" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:25:46 np0005539505 nova_compute[186958]: 2025-11-29 07:25:46.999 186962 DEBUG nova.objects.instance [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lazy-loading 'migration_context' on Instance uuid 3b61bf63-8328-4d31-93e5-0a19ca27cd63 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.026 186962 DEBUG nova.virt.libvirt.vif [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-284566059',display_name='tempest-ServerDiskConfigTestJSON-server-284566059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-284566059',id=125,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:25:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6d55e57bfd184513a304a61cc1cb3730',ramdisk_id='',reservation_id='r-9kh0vk9d',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='v
irtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1282760174',owner_user_name='tempest-ServerDiskConfigTestJSON-1282760174-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:25:42Z,user_data=None,user_id='000fb7b950024e16902cd58f2ea16ac9',uuid=3b61bf63-8328-4d31-93e5-0a19ca27cd63,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.027 186962 DEBUG nova.network.os_vif_util [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converting VIF {"id": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "address": "fa:16:3e:04:b9:c5", "network": {"id": "9b34af6b-edf9-4b27-b1dc-2b18c2eec958", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-642936070-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6d55e57bfd184513a304a61cc1cb3730", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae72aab-be", "ovs_interfaceid": "bae72aab-bece-4ddf-8a55-f5925e45ca90", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.028 186962 DEBUG nova.network.os_vif_util [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.028 186962 DEBUG os_vif [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.030 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae72aab-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.030 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.032 186962 INFO os_vif [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:04:b9:c5,bridge_name='br-int',has_traffic_filtering=True,id=bae72aab-bece-4ddf-8a55-f5925e45ca90,network=Network(9b34af6b-edf9-4b27-b1dc-2b18c2eec958),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae72aab-be')#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.033 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.033 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.099 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401132.0986888, 3b61bf63-8328-4d31-93e5-0a19ca27cd63 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.100 186962 INFO nova.compute.manager [-] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.111 186962 DEBUG nova.compute.provider_tree [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.115 186962 DEBUG nova.compute.manager [None req-be8592df-d59f-4c0a-8833-1f4f5343c8f3 - - - - - -] [instance: 3b61bf63-8328-4d31-93e5-0a19ca27cd63] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.127 186962 DEBUG nova.scheduler.client.report [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.170 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.287 186962 INFO nova.scheduler.client.report [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Deleted allocation for migration 0a5d78f3-f034-471a-9f47-7c4fe7ef5c65#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.358 186962 DEBUG oslo_concurrency.lockutils [None req-9adc3379-1edb-4222-9cd4-2658bfb7434e 000fb7b950024e16902cd58f2ea16ac9 6d55e57bfd184513a304a61cc1cb3730 - - default default] Lock "3b61bf63-8328-4d31-93e5-0a19ca27cd63" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:47 np0005539505 nova_compute[186958]: 2025-11-29 07:25:47.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:47 np0005539505 podman[239164]: 2025-11-29 07:25:47.724152181 +0000 UTC m=+0.056087959 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 29 02:25:47 np0005539505 podman[239165]: 2025-11-29 07:25:47.746651647 +0000 UTC m=+0.074875620 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:25:50 np0005539505 nova_compute[186958]: 2025-11-29 07:25:50.392 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:50 np0005539505 podman[239206]: 2025-11-29 07:25:50.722913392 +0000 UTC m=+0.058012273 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:25:50 np0005539505 nova_compute[186958]: 2025-11-29 07:25:50.777 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:50 np0005539505 nova_compute[186958]: 2025-11-29 07:25:50.777 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:51 np0005539505 nova_compute[186958]: 2025-11-29 07:25:51.595 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:25:51 np0005539505 nova_compute[186958]: 2025-11-29 07:25:51.978 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:51 np0005539505 nova_compute[186958]: 2025-11-29 07:25:51.979 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:51 np0005539505 nova_compute[186958]: 2025-11-29 07:25:51.989 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:25:51 np0005539505 nova_compute[186958]: 2025-11-29 07:25:51.989 186962 INFO nova.compute.claims [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:25:52 np0005539505 nova_compute[186958]: 2025-11-29 07:25:52.661 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:53 np0005539505 nova_compute[186958]: 2025-11-29 07:25:53.314 186962 DEBUG nova.compute.provider_tree [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:25:54 np0005539505 nova_compute[186958]: 2025-11-29 07:25:54.480 186962 DEBUG nova.scheduler.client.report [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:25:55 np0005539505 nova_compute[186958]: 2025-11-29 07:25:55.394 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:56 np0005539505 nova_compute[186958]: 2025-11-29 07:25:56.783 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 4.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:56 np0005539505 nova_compute[186958]: 2025-11-29 07:25:56.784 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:25:57 np0005539505 nova_compute[186958]: 2025-11-29 07:25:57.663 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.082 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.082 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.138 186962 INFO nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.161 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.545 186962 DEBUG nova.policy [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.709 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.711 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.711 186962 INFO nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Creating image(s)#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.712 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.712 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.712 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.726 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.781 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.782 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.783 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.794 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.849 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.850 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.880 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.881 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.881 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.958 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.959 186962 DEBUG nova.virt.disk.api [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:25:58 np0005539505 nova_compute[186958]: 2025-11-29 07:25:58.959 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.013 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.014 186962 DEBUG nova.virt.disk.api [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.014 186962 DEBUG nova.objects.instance [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.034 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.035 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Ensure instance console log exists: /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.035 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.036 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:25:59 np0005539505 nova_compute[186958]: 2025-11-29 07:25:59.036 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:00 np0005539505 nova_compute[186958]: 2025-11-29 07:26:00.251 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Successfully created port: 332c67d6-3bc6-4636-b59f-6368eb8b8a14 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:26:00 np0005539505 nova_compute[186958]: 2025-11-29 07:26:00.448 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:00 np0005539505 podman[239239]: 2025-11-29 07:26:00.741425125 +0000 UTC m=+0.073012947 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:26:00 np0005539505 podman[239240]: 2025-11-29 07:26:00.746151309 +0000 UTC m=+0.074277573 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.134 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Successfully updated port: 332c67d6-3bc6-4636-b59f-6368eb8b8a14 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.150 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.151 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.151 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.209 186962 DEBUG nova.compute.manager [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.210 186962 DEBUG nova.compute.manager [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing instance network info cache due to event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.210 186962 DEBUG oslo_concurrency.lockutils [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:01 np0005539505 nova_compute[186958]: 2025-11-29 07:26:01.313 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:26:02 np0005539505 nova_compute[186958]: 2025-11-29 07:26:02.665 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.515 186962 DEBUG nova.network.neutron [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.557 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.557 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance network_info: |[{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.557 186962 DEBUG oslo_concurrency.lockutils [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.558 186962 DEBUG nova.network.neutron [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.560 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start _get_guest_xml network_info=[{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.565 186962 WARNING nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.573 186962 DEBUG nova.virt.libvirt.host [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.574 186962 DEBUG nova.virt.libvirt.host [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.581 186962 DEBUG nova.virt.libvirt.host [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.582 186962 DEBUG nova.virt.libvirt.host [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.583 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.583 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.584 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.584 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.584 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.584 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.584 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.585 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.585 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.585 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.585 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.586 186962 DEBUG nova.virt.hardware [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.589 186962 DEBUG nova.virt.libvirt.vif [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:58Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.590 186962 DEBUG nova.network.os_vif_util [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.590 186962 DEBUG nova.network.os_vif_util [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.591 186962 DEBUG nova.objects.instance [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.607 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <uuid>ce56878f-34d0-4d8d-bc77-ee4b14e32746</uuid>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <name>instance-00000080</name>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-310936262</nova:name>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:26:03</nova:creationTime>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        <nova:port uuid="332c67d6-3bc6-4636-b59f-6368eb8b8a14">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="serial">ce56878f-34d0-4d8d-bc77-ee4b14e32746</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="uuid">ce56878f-34d0-4d8d-bc77-ee4b14e32746</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:02:b4:e0"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <target dev="tap332c67d6-3b"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/console.log" append="off"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:26:03 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:26:03 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:26:03 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:26:03 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.608 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Preparing to wait for external event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.608 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.608 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.609 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.609 186962 DEBUG nova.virt.libvirt.vif [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:25:58Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.610 186962 DEBUG nova.network.os_vif_util [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.610 186962 DEBUG nova.network.os_vif_util [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.611 186962 DEBUG os_vif [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.611 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.611 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.612 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.614 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap332c67d6-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.614 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap332c67d6-3b, col_values=(('external_ids', {'iface-id': '332c67d6-3bc6-4636-b59f-6368eb8b8a14', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:b4:e0', 'vm-uuid': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.647 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:03 np0005539505 NetworkManager[55134]: <info>  [1764401163.6488] manager: (tap332c67d6-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.650 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.652 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.653 186962 INFO os_vif [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b')
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.848 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.849 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.849 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:02:b4:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:26:03 np0005539505 nova_compute[186958]: 2025-11-29 07:26:03.850 186962 INFO nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Using config drive
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.304 186962 INFO nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Creating config drive at /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.309 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprftyiaux execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.431 186962 DEBUG oslo_concurrency.processutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprftyiaux" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:26:04 np0005539505 kernel: tap332c67d6-3b: entered promiscuous mode
Nov 29 02:26:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:04Z|00574|binding|INFO|Claiming lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 for this chassis.
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.4918] manager: (tap332c67d6-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.494 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:04Z|00575|binding|INFO|332c67d6-3bc6-4636-b59f-6368eb8b8a14: Claiming fa:16:3e:02:b4:e0 10.100.0.4
Nov 29 02:26:04 np0005539505 systemd-udevd[239333]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:26:04 np0005539505 podman[239294]: 2025-11-29 07:26:04.526014368 +0000 UTC m=+0.062198742 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.5328] device (tap332c67d6-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.5335] device (tap332c67d6-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:26:04 np0005539505 podman[239291]: 2025-11-29 07:26:04.539660014 +0000 UTC m=+0.078065181 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible)
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:04 np0005539505 systemd-machined[153285]: New machine qemu-66-instance-00000080.
Nov 29 02:26:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:04Z|00576|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 ovn-installed in OVS
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:26:04 np0005539505 systemd[1]: Started Virtual Machine qemu-66-instance-00000080.
Nov 29 02:26:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:04Z|00577|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 up in Southbound
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.645 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:b4:e0 10.100.0.4'], port_security=['fa:16:3e:02:b4:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-615772bd-4aec-4aff-ba55-f16ad03ef223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4c56a9cf-f5de-4201-b604-112c4ca8006f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc3dafc6-3da1-477c-a8c4-bd3e3f4c13f9, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=332c67d6-3bc6-4636-b59f-6368eb8b8a14) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.646 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 in datapath 615772bd-4aec-4aff-ba55-f16ad03ef223 bound to our chassis
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.648 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 615772bd-4aec-4aff-ba55-f16ad03ef223
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.659 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[52de43b3-2f00-40fe-a494-a0f138873b3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.661 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap615772bd-41 in ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.663 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap615772bd-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.663 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb09a3a-9394-4f44-ba44-a9df455117db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.664 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd844f64-6fbf-452c-ad8c-201cf54a31e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.674 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[98e6d6d0-30f2-48a8-9262-f3c6f30f4c41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.696 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5618d3-0688-425e-b60d-037689e08500]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.726 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[5062c39d-5215-478e-9ff2-73a178f79c17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.7332] manager: (tap615772bd-40): new Veth device (/org/freedesktop/NetworkManager/Devices/284)
Nov 29 02:26:04 np0005539505 systemd-udevd[239341]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.732 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc630bd-96d5-4ad5-b5a4-b4d3ab7256a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.771 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[31348060-eb27-4c00-9636-426eddc25665]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.774 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b41630c8-dc24-41ad-b5f9-21ac7c887ec1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.7942] device (tap615772bd-40): carrier: link connected
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.799 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3723ce-8057-4b42-9d71-a41013f68e36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.814 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a33e4ae5-faaf-4d38-afb9-2228dac9d42b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap615772bd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:38:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661237, 'reachable_time': 19224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239374, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.828 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21c15505-6338-44d4-aeae-5b94eef3be99]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:38d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 661237, 'tstamp': 661237}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239375, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.845 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8275f3-4775-4a60-a08c-f4e99a6b8fe8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap615772bd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:38:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661237, 'reachable_time': 19224, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239376, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.874 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7861d08-004c-4d76-818d-f064c5bb8881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.937 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[71465652-a5e4-4d8e-b5c0-fc3cae88443d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.939 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap615772bd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.939 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.940 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap615772bd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.941 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:04 np0005539505 NetworkManager[55134]: <info>  [1764401164.9421] manager: (tap615772bd-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Nov 29 02:26:04 np0005539505 kernel: tap615772bd-40: entered promiscuous mode
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.943 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.946 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap615772bd-40, col_values=(('external_ids', {'iface-id': 'e71a836b-a9d0-4b09-9143-f69ad5f9879a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:04Z|00578|binding|INFO|Releasing lport e71a836b-a9d0-4b09-9143-f69ad5f9879a from this chassis (sb_readonly=0)
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.948 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.948 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ea21506e-45e2-4d9a-8eef-3356d3173291]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.949 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-615772bd-4aec-4aff-ba55-f16ad03ef223
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 615772bd-4aec-4aff-ba55-f16ad03ef223
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:26:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:04.950 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'env', 'PROCESS_TAG=haproxy-615772bd-4aec-4aff-ba55-f16ad03ef223', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/615772bd-4aec-4aff-ba55-f16ad03ef223.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.953 186962 DEBUG nova.compute.manager [req-7efac09a-dd79-4ad7-86fa-eb3b0f10d976 req-e4a7e1f7-9a45-440b-84be-c7300f750126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.953 186962 DEBUG oslo_concurrency.lockutils [req-7efac09a-dd79-4ad7-86fa-eb3b0f10d976 req-e4a7e1f7-9a45-440b-84be-c7300f750126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.953 186962 DEBUG oslo_concurrency.lockutils [req-7efac09a-dd79-4ad7-86fa-eb3b0f10d976 req-e4a7e1f7-9a45-440b-84be-c7300f750126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.953 186962 DEBUG oslo_concurrency.lockutils [req-7efac09a-dd79-4ad7-86fa-eb3b0f10d976 req-e4a7e1f7-9a45-440b-84be-c7300f750126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.953 186962 DEBUG nova.compute.manager [req-7efac09a-dd79-4ad7-86fa-eb3b0f10d976 req-e4a7e1f7-9a45-440b-84be-c7300f750126 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Processing event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:26:04 np0005539505 nova_compute[186958]: 2025-11-29 07:26:04.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.066 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401165.0658166, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.066 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Started (Lifecycle Event)#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.069 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.072 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.076 186962 INFO nova.virt.libvirt.driver [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance spawned successfully.#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.076 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.101 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.101 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.103 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.103 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.103 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.104 186962 DEBUG nova.virt.libvirt.driver [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.184 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.187 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.218 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.218 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401165.0661447, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.219 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.242 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.245 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401165.0720143, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.245 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.258 186962 INFO nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Took 6.55 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.259 186962 DEBUG nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.267 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.269 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.298 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:26:05 np0005539505 podman[239415]: 2025-11-29 07:26:05.324519567 +0000 UTC m=+0.059094613 container create d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.353 186962 INFO nova.compute.manager [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Took 13.41 seconds to build instance.#033[00m
Nov 29 02:26:05 np0005539505 systemd[1]: Started libpod-conmon-d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964.scope.
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.379 186962 DEBUG oslo_concurrency.lockutils [None req-5ea37cba-c8f9-4f35-ad63-d7e19fc2e851 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:05 np0005539505 podman[239415]: 2025-11-29 07:26:05.28857766 +0000 UTC m=+0.023152746 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:26:05 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:26:05 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/835c8917a39a35dd61c5758529998f95307dec2a8db023d2ea1e68d1a49ef291/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:26:05 np0005539505 podman[239415]: 2025-11-29 07:26:05.416796249 +0000 UTC m=+0.151371305 container init d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:05 np0005539505 podman[239415]: 2025-11-29 07:26:05.422001136 +0000 UTC m=+0.156576182 container start d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:05 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [NOTICE]   (239434) : New worker (239436) forked
Nov 29 02:26:05 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [NOTICE]   (239434) : Loading success.
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.448 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.796 186962 DEBUG nova.network.neutron [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updated VIF entry in instance network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.797 186962 DEBUG nova.network.neutron [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:05 np0005539505 nova_compute[186958]: 2025-11-29 07:26:05.892 186962 DEBUG oslo_concurrency.lockutils [req-0b067d92-316c-4860-911a-46cac12f2ddf req-0a3cf948-3706-4a3e-826c-3a8f7ad2cc58 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:06 np0005539505 nova_compute[186958]: 2025-11-29 07:26:06.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.051 186962 DEBUG nova.compute.manager [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.051 186962 DEBUG oslo_concurrency.lockutils [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.051 186962 DEBUG oslo_concurrency.lockutils [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.052 186962 DEBUG oslo_concurrency.lockutils [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.052 186962 DEBUG nova.compute.manager [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:07 np0005539505 nova_compute[186958]: 2025-11-29 07:26:07.052 186962 WARNING nova.compute.manager [req-f9ab8760-9781-42bf-9fa7-2eab3e441e80 req-ff560d60-7626-442b-b54a-294d9cf2b4c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:26:08 np0005539505 nova_compute[186958]: 2025-11-29 07:26:08.647 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.395 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539505 NetworkManager[55134]: <info>  [1764401169.3967] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Nov 29 02:26:09 np0005539505 NetworkManager[55134]: <info>  [1764401169.3977] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/287)
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.556 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:09Z|00579|binding|INFO|Releasing lport e71a836b-a9d0-4b09-9143-f69ad5f9879a from this chassis (sb_readonly=0)
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.589 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.692 186962 DEBUG nova.compute.manager [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.695 186962 DEBUG nova.compute.manager [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing instance network info cache due to event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.695 186962 DEBUG oslo_concurrency.lockutils [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.696 186962 DEBUG oslo_concurrency.lockutils [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:09 np0005539505 nova_compute[186958]: 2025-11-29 07:26:09.696 186962 DEBUG nova.network.neutron [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:26:10 np0005539505 nova_compute[186958]: 2025-11-29 07:26:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:10 np0005539505 nova_compute[186958]: 2025-11-29 07:26:10.450 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:12 np0005539505 nova_compute[186958]: 2025-11-29 07:26:12.345 186962 DEBUG nova.network.neutron [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updated VIF entry in instance network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:26:12 np0005539505 nova_compute[186958]: 2025-11-29 07:26:12.347 186962 DEBUG nova.network.neutron [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:13 np0005539505 nova_compute[186958]: 2025-11-29 07:26:13.728 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:14 np0005539505 nova_compute[186958]: 2025-11-29 07:26:14.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:14Z|00580|memory|INFO|peak resident set size grew 50% in last 3283.6 seconds, from 16000 kB to 24072 kB
Nov 29 02:26:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:14Z|00581|memory|INFO|idl-cells-OVN_Southbound:10091 idl-cells-Open_vSwitch:813 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:345 lflow-cache-entries-cache-matches:275 lflow-cache-size-KB:1439 local_datapath_usage-KB:2 ofctrl_desired_flow_usage-KB:599 ofctrl_installed_flow_usage-KB:440 ofctrl_sb_flow_ref_usage-KB:223
Nov 29 02:26:14 np0005539505 nova_compute[186958]: 2025-11-29 07:26:14.748 186962 DEBUG oslo_concurrency.lockutils [req-78e7d562-0372-45fb-adea-2173c2d77a5e req-0c807b7c-a504-4cf5-896a-e60b8c40f7db 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:15 np0005539505 nova_compute[186958]: 2025-11-29 07:26:15.451 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:17 np0005539505 nova_compute[186958]: 2025-11-29 07:26:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:17 np0005539505 nova_compute[186958]: 2025-11-29 07:26:17.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:26:18 np0005539505 nova_compute[186958]: 2025-11-29 07:26:18.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:18 np0005539505 podman[239467]: 2025-11-29 07:26:18.720099781 +0000 UTC m=+0.048405761 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:26:18 np0005539505 nova_compute[186958]: 2025-11-29 07:26:18.730 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:18 np0005539505 podman[239466]: 2025-11-29 07:26:18.732158282 +0000 UTC m=+0.061855602 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:26:18 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:18Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:b4:e0 10.100.0.4
Nov 29 02:26:18 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:18Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:b4:e0 10.100.0.4
Nov 29 02:26:20 np0005539505 nova_compute[186958]: 2025-11-29 07:26:20.454 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:21 np0005539505 nova_compute[186958]: 2025-11-29 07:26:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:21 np0005539505 podman[239507]: 2025-11-29 07:26:21.710300099 +0000 UTC m=+0.045597741 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:26:22 np0005539505 nova_compute[186958]: 2025-11-29 07:26:22.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:23 np0005539505 nova_compute[186958]: 2025-11-29 07:26:23.198 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:23 np0005539505 nova_compute[186958]: 2025-11-29 07:26:23.198 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:23 np0005539505 nova_compute[186958]: 2025-11-29 07:26:23.198 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:23 np0005539505 nova_compute[186958]: 2025-11-29 07:26:23.199 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:26:23 np0005539505 nova_compute[186958]: 2025-11-29 07:26:23.731 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:25 np0005539505 nova_compute[186958]: 2025-11-29 07:26:25.498 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:27.504 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:27.505 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:27.505 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:28 np0005539505 nova_compute[186958]: 2025-11-29 07:26:28.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.223 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.291 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.291 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.342 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.478 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.480 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5522MB free_disk=73.04518127441406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.480 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.480 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.550 186962 INFO nova.compute.manager [None req-da92d82d-bac0-4cce-8fbe-07cfeec1c879 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Get console output#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.555 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.965 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.965 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:26:29 np0005539505 nova_compute[186958]: 2025-11-29 07:26:29.966 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:26:30 np0005539505 nova_compute[186958]: 2025-11-29 07:26:30.093 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:26:30 np0005539505 nova_compute[186958]: 2025-11-29 07:26:30.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:30 np0005539505 nova_compute[186958]: 2025-11-29 07:26:30.746 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:26:31 np0005539505 podman[239535]: 2025-11-29 07:26:31.710758202 +0000 UTC m=+0.046389504 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:26:31 np0005539505 podman[239536]: 2025-11-29 07:26:31.746254247 +0000 UTC m=+0.081202350 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 29 02:26:33 np0005539505 nova_compute[186958]: 2025-11-29 07:26:33.074 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:26:33 np0005539505 nova_compute[186958]: 2025-11-29 07:26:33.074 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:33 np0005539505 nova_compute[186958]: 2025-11-29 07:26:33.735 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:34 np0005539505 nova_compute[186958]: 2025-11-29 07:26:34.070 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:34 np0005539505 nova_compute[186958]: 2025-11-29 07:26:34.070 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:34 np0005539505 nova_compute[186958]: 2025-11-29 07:26:34.070 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:26:34 np0005539505 nova_compute[186958]: 2025-11-29 07:26:34.070 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:26:34 np0005539505 podman[239585]: 2025-11-29 07:26:34.718810167 +0000 UTC m=+0.055525992 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, config_id=multipathd)
Nov 29 02:26:34 np0005539505 podman[239586]: 2025-11-29 07:26:34.728866672 +0000 UTC m=+0.061122741 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm)
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.164 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.165 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.165 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.165 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.313 186962 DEBUG oslo_concurrency.lockutils [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.314 186962 DEBUG oslo_concurrency.lockutils [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.315 186962 DEBUG nova.compute.manager [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.320 186962 DEBUG nova.compute.manager [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.321 186962 DEBUG nova.objects.instance [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'flavor' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:35 np0005539505 nova_compute[186958]: 2025-11-29 07:26:35.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:36 np0005539505 nova_compute[186958]: 2025-11-29 07:26:36.686 186962 DEBUG nova.objects.instance [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'info_cache' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:37 np0005539505 nova_compute[186958]: 2025-11-29 07:26:37.404 186962 DEBUG nova.virt.libvirt.driver [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:26:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:37.841 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:37 np0005539505 nova_compute[186958]: 2025-11-29 07:26:37.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:37.842 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:26:38 np0005539505 nova_compute[186958]: 2025-11-29 07:26:38.751 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:39 np0005539505 kernel: tap332c67d6-3b (unregistering): left promiscuous mode
Nov 29 02:26:39 np0005539505 NetworkManager[55134]: <info>  [1764401199.5915] device (tap332c67d6-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:26:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:39Z|00582|binding|INFO|Releasing lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 from this chassis (sb_readonly=0)
Nov 29 02:26:39 np0005539505 nova_compute[186958]: 2025-11-29 07:26:39.597 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:39Z|00583|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 down in Southbound
Nov 29 02:26:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:39Z|00584|binding|INFO|Removing iface tap332c67d6-3b ovn-installed in OVS
Nov 29 02:26:39 np0005539505 nova_compute[186958]: 2025-11-29 07:26:39.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:39 np0005539505 nova_compute[186958]: 2025-11-29 07:26:39.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:39 np0005539505 nova_compute[186958]: 2025-11-29 07:26:39.624 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:39 np0005539505 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000080.scope: Deactivated successfully.
Nov 29 02:26:39 np0005539505 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000080.scope: Consumed 14.352s CPU time.
Nov 29 02:26:39 np0005539505 systemd-machined[153285]: Machine qemu-66-instance-00000080 terminated.
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.422 186962 INFO nova.virt.libvirt.driver [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.430 186962 INFO nova.virt.libvirt.driver [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance destroyed successfully.#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.431 186962 DEBUG nova.objects.instance [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'numa_topology' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.690 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:b4:e0 10.100.0.4'], port_security=['fa:16:3e:02:b4:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-615772bd-4aec-4aff-ba55-f16ad03ef223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4c56a9cf-f5de-4201-b604-112c4ca8006f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.220'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc3dafc6-3da1-477c-a8c4-bd3e3f4c13f9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=332c67d6-3bc6-4636-b59f-6368eb8b8a14) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.691 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 in datapath 615772bd-4aec-4aff-ba55-f16ad03ef223 unbound from our chassis#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.693 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 615772bd-4aec-4aff-ba55-f16ad03ef223, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.694 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e943e01-5b9f-4036-854f-b7df12550bb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.694 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 namespace which is not needed anymore#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.809 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.809 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.809 186962 DEBUG nova.compute.manager [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.810 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:26:40 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [NOTICE]   (239434) : haproxy version is 2.8.14-c23fe91
Nov 29 02:26:40 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [NOTICE]   (239434) : path to executable is /usr/sbin/haproxy
Nov 29 02:26:40 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [WARNING]  (239434) : Exiting Master process...
Nov 29 02:26:40 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [ALERT]    (239434) : Current worker (239436) exited with code 143 (Terminated)
Nov 29 02:26:40 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239430]: [WARNING]  (239434) : All workers exited. Exiting... (0)
Nov 29 02:26:40 np0005539505 systemd[1]: libpod-d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964.scope: Deactivated successfully.
Nov 29 02:26:40 np0005539505 podman[239664]: 2025-11-29 07:26:40.829633565 +0000 UTC m=+0.043710738 container died d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:26:40 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964-userdata-shm.mount: Deactivated successfully.
Nov 29 02:26:40 np0005539505 systemd[1]: var-lib-containers-storage-overlay-835c8917a39a35dd61c5758529998f95307dec2a8db023d2ea1e68d1a49ef291-merged.mount: Deactivated successfully.
Nov 29 02:26:40 np0005539505 podman[239664]: 2025-11-29 07:26:40.868649229 +0000 UTC m=+0.082726402 container cleanup d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:26:40 np0005539505 systemd[1]: libpod-conmon-d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964.scope: Deactivated successfully.
Nov 29 02:26:40 np0005539505 podman[239697]: 2025-11-29 07:26:40.926770454 +0000 UTC m=+0.039816788 container remove d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.931 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad07d5a-491c-43bc-9291-6d64e5cf2182]: (4, ('Sat Nov 29 07:26:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 (d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964)\nd4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964\nSat Nov 29 07:26:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 (d4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964)\nd4d52f902fd63a3a3bcf6bd78ed24e4f2f6a75dd15915fe5a0ee09e7c3408964\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.933 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1eaa477e-75d6-4a31-b587-469671a00c07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.934 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap615772bd-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.944 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:40 np0005539505 kernel: tap615772bd-40: left promiscuous mode
Nov 29 02:26:40 np0005539505 nova_compute[186958]: 2025-11-29 07:26:40.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.963 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4deec86d-1a72-4df8-852b-dc8729398f8e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.984 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27fdff9a-5474-4148-ae52-a3978409d426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:40.986 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b85e2715-051e-488a-a956-3be6f978f870]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:41.001 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b57f11e-e875-4955-b5ca-39d2614865b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 661230, 'reachable_time': 36972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239717, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:41.004 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:26:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:41.004 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8442e0f3-3b46-4df7-bfb0-b63bfb889024]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:41 np0005539505 systemd[1]: run-netns-ovnmeta\x2d615772bd\x2d4aec\x2d4aff\x2dba55\x2df16ad03ef223.mount: Deactivated successfully.
Nov 29 02:26:41 np0005539505 nova_compute[186958]: 2025-11-29 07:26:41.905 186962 DEBUG oslo_concurrency.lockutils [None req-9bedc767-4342-4eb0-bec8-ab1aeafc26d5 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 6.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.671 186962 DEBUG nova.compute.manager [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.671 186962 DEBUG oslo_concurrency.lockutils [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.672 186962 DEBUG oslo_concurrency.lockutils [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.672 186962 DEBUG oslo_concurrency.lockutils [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.672 186962 DEBUG nova.compute.manager [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:42 np0005539505 nova_compute[186958]: 2025-11-29 07:26:42.672 186962 WARNING nova.compute.manager [req-d7dcce4f-c7b0-44f7-a55d-a806470658be req-fc7a7343-b636-4e11-94f6-b183e236ba4b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 02:26:43 np0005539505 nova_compute[186958]: 2025-11-29 07:26:43.753 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:44.844 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.192 186962 DEBUG nova.compute.manager [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.192 186962 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.192 186962 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.193 186962 DEBUG oslo_concurrency.lockutils [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.193 186962 DEBUG nova.compute.manager [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.193 186962 WARNING nova.compute.manager [req-aaa1f71d-1df0-4218-ba87-6767850795cc req-387f8eb2-0140-42dc-a8a9-6897d872004f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state stopped and task_state None.#033[00m
Nov 29 02:26:45 np0005539505 nova_compute[186958]: 2025-11-29 07:26:45.527 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:46 np0005539505 nova_compute[186958]: 2025-11-29 07:26:46.969 186962 INFO nova.compute.manager [None req-15cb7f4f-802d-4d4e-b081-b23b5383bb1b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Get console output#033[00m
Nov 29 02:26:47 np0005539505 nova_compute[186958]: 2025-11-29 07:26:47.380 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'flavor' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:47 np0005539505 nova_compute[186958]: 2025-11-29 07:26:47.419 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'info_cache' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:47 np0005539505 nova_compute[186958]: 2025-11-29 07:26:47.476 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:26:47 np0005539505 nova_compute[186958]: 2025-11-29 07:26:47.476 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:26:47 np0005539505 nova_compute[186958]: 2025-11-29 07:26:47.477 186962 DEBUG nova.network.neutron [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.093 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746', 'name': 'tempest-TestNetworkAdvancedServerOps-server-310936262', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000080', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.094 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.095 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.095 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.096 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.097 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.097 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.097 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.097 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>]
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.098 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.098 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.099 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.099 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.100 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.100 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.100 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.101 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.101 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>]
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.102 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.102 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.102 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>]
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.102 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.103 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.104 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.105 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.106 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.106 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.106 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-310936262>]
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.107 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.108 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.109 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.109 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.110 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.111 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.111 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.112 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.113 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:26:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:26:48.114 12 DEBUG ceilometer.compute.pollsters [-] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000080, id=ce56878f-34d0-4d8d-bc77-ee4b14e32746>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:26:48 np0005539505 nova_compute[186958]: 2025-11-29 07:26:48.755 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:49 np0005539505 podman[239719]: 2025-11-29 07:26:49.733362389 +0000 UTC m=+0.056826019 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:26:49 np0005539505 podman[239718]: 2025-11-29 07:26:49.737048194 +0000 UTC m=+0.063483978 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:26:50 np0005539505 nova_compute[186958]: 2025-11-29 07:26:50.567 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:50 np0005539505 nova_compute[186958]: 2025-11-29 07:26:50.755 186962 DEBUG nova.network.neutron [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.234 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.276 186962 INFO nova.virt.libvirt.driver [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance destroyed successfully.#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.277 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'numa_topology' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.298 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.334 186962 DEBUG nova.virt.libvirt.vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:26:40Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.335 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.336 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.336 186962 DEBUG os_vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.338 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.338 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap332c67d6-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.340 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.344 186962 INFO os_vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b')#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.351 186962 DEBUG nova.virt.libvirt.driver [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start _get_guest_xml network_info=[{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.355 186962 WARNING nova.virt.libvirt.driver [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.372 186962 DEBUG nova.virt.libvirt.host [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.373 186962 DEBUG nova.virt.libvirt.host [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.381 186962 DEBUG nova.virt.libvirt.host [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.382 186962 DEBUG nova.virt.libvirt.host [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.383 186962 DEBUG nova.virt.libvirt.driver [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.383 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.384 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.384 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.385 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.385 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.385 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.385 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.386 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.386 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.386 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.387 186962 DEBUG nova.virt.hardware [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.387 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'vcpu_model' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.695 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.752 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.753 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.753 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.754 186962 DEBUG oslo_concurrency.lockutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.755 186962 DEBUG nova.virt.libvirt.vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:26:40Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.755 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.756 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.757 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.802 186962 DEBUG nova.virt.libvirt.driver [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <uuid>ce56878f-34d0-4d8d-bc77-ee4b14e32746</uuid>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <name>instance-00000080</name>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-310936262</nova:name>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:26:51</nova:creationTime>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        <nova:port uuid="332c67d6-3bc6-4636-b59f-6368eb8b8a14">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="serial">ce56878f-34d0-4d8d-bc77-ee4b14e32746</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="uuid">ce56878f-34d0-4d8d-bc77-ee4b14e32746</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk.config"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:02:b4:e0"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <target dev="tap332c67d6-3b"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/console.log" append="off"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:26:51 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:26:51 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:26:51 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:26:51 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.804 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.857 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.859 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.914 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:51 np0005539505 nova_compute[186958]: 2025-11-29 07:26:51.915 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.036 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.088 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.088 186962 DEBUG nova.virt.disk.api [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.089 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.138 186962 DEBUG oslo_concurrency.processutils [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.139 186962 DEBUG nova.virt.disk.api [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.139 186962 DEBUG nova.objects.instance [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.158 186962 DEBUG nova.virt.libvirt.vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:26:40Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.158 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.159 186962 DEBUG nova.network.os_vif_util [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.160 186962 DEBUG os_vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.160 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.161 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.161 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.163 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.163 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap332c67d6-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.164 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap332c67d6-3b, col_values=(('external_ids', {'iface-id': '332c67d6-3bc6-4636-b59f-6368eb8b8a14', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:b4:e0', 'vm-uuid': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.165 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.1660] manager: (tap332c67d6-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/288)
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.167 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.169 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.170 186962 INFO os_vif [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b')#033[00m
Nov 29 02:26:52 np0005539505 kernel: tap332c67d6-3b: entered promiscuous mode
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.244 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:52Z|00585|binding|INFO|Claiming lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 for this chassis.
Nov 29 02:26:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:52Z|00586|binding|INFO|332c67d6-3bc6-4636-b59f-6368eb8b8a14: Claiming fa:16:3e:02:b4:e0 10.100.0.4
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.2482] manager: (tap332c67d6-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/289)
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.254 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:b4:e0 10.100.0.4'], port_security=['fa:16:3e:02:b4:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-615772bd-4aec-4aff-ba55-f16ad03ef223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4c56a9cf-f5de-4201-b604-112c4ca8006f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.220'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc3dafc6-3da1-477c-a8c4-bd3e3f4c13f9, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=332c67d6-3bc6-4636-b59f-6368eb8b8a14) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.256 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 in datapath 615772bd-4aec-4aff-ba55-f16ad03ef223 bound to our chassis#033[00m
Nov 29 02:26:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:52Z|00587|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 ovn-installed in OVS
Nov 29 02:26:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:52Z|00588|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 up in Southbound
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.258 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.259 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 615772bd-4aec-4aff-ba55-f16ad03ef223#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.263 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.273 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4f077844-d350-410f-b006-ac8c2c83d976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.274 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap615772bd-41 in ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.276 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap615772bd-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.276 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[446aa1ec-c4e0-45f0-9629-c4d7758c8f23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.278 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd3f09b-2b39-4a90-b979-6a4083ad0671]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 systemd-udevd[239816]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:26:52 np0005539505 systemd-machined[153285]: New machine qemu-67-instance-00000080.
Nov 29 02:26:52 np0005539505 podman[239785]: 2025-11-29 07:26:52.283871744 +0000 UTC m=+0.077692590 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.289 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4d5cfeff-9d1d-497d-9c47-b5c09cab84e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 systemd[1]: Started Virtual Machine qemu-67-instance-00000080.
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.3033] device (tap332c67d6-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.3048] device (tap332c67d6-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.306 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2258dcdd-9d08-4305-a5ae-b54af112ca39]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.331 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3d424682-a1b0-4e6b-a952-b0735ebb55fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.3382] manager: (tap615772bd-40): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.337 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[12cffb6d-78a5-4bda-b73a-c7130f87ad32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.364 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e92c444d-6b46-4ee8-b9bc-3d54695c4cb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.367 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea4361b-f6c1-41dc-ae8b-e75a0b2c5370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.3855] device (tap615772bd-40): carrier: link connected
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.391 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[71ed4906-1640-4bc9-b361-1d2a19f071ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.407 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5650e476-5c15-49c1-b2fd-9be0db0cb939]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap615772bd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:38:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 665996, 'reachable_time': 18266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239848, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.420 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf6dc18-9f56-491b-9e28-5e8d350f0ad6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe08:38d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 665996, 'tstamp': 665996}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239849, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.436 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aee756de-f372-436c-bb9c-8420589fb7c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap615772bd-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:08:38:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 188], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 665996, 'reachable_time': 18266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239850, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.462 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6604f25b-7635-4a60-a4c0-10913e6dc7f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f37551a-fd31-4e51-b51d-d296424ef24a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.519 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap615772bd-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.519 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.520 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap615772bd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 kernel: tap615772bd-40: entered promiscuous mode
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 NetworkManager[55134]: <info>  [1764401212.5253] manager: (tap615772bd-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.526 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap615772bd-40, col_values=(('external_ids', {'iface-id': 'e71a836b-a9d0-4b09-9143-f69ad5f9879a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:26:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:52Z|00589|binding|INFO|Releasing lport e71a836b-a9d0-4b09-9143-f69ad5f9879a from this chassis (sb_readonly=0)
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.527 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.528 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.529 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.530 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b2fd4e2c-263c-45e5-9728-b082d3925c7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.531 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-615772bd-4aec-4aff-ba55-f16ad03ef223
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/615772bd-4aec-4aff-ba55-f16ad03ef223.pid.haproxy
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 615772bd-4aec-4aff-ba55-f16ad03ef223
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:26:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:26:52.532 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'env', 'PROCESS_TAG=haproxy-615772bd-4aec-4aff-ba55-f16ad03ef223', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/615772bd-4aec-4aff-ba55-f16ad03ef223.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.831 186962 DEBUG nova.compute.manager [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.831 186962 DEBUG oslo_concurrency.lockutils [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.832 186962 DEBUG oslo_concurrency.lockutils [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.832 186962 DEBUG oslo_concurrency.lockutils [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.832 186962 DEBUG nova.compute.manager [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.833 186962 WARNING nova.compute.manager [req-faab7135-7718-4574-abf5-f1ca37895912 req-6af9ed02-db3f-4d82-82bc-f7021dbd4439 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.867 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for ce56878f-34d0-4d8d-bc77-ee4b14e32746 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.868 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401212.8672657, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.868 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.870 186962 DEBUG nova.compute.manager [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.874 186962 INFO nova.virt.libvirt.driver [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance rebooted successfully.#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.875 186962 DEBUG nova.compute.manager [None req-3781b13f-3263-4ceb-a09d-08f783bc337e bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.910 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.914 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:52 np0005539505 podman[239890]: 2025-11-29 07:26:52.969628303 +0000 UTC m=+0.068697965 container create 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.979 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.980 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401212.868132, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:26:52 np0005539505 nova_compute[186958]: 2025-11-29 07:26:52.980 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Started (Lifecycle Event)#033[00m
Nov 29 02:26:53 np0005539505 systemd[1]: Started libpod-conmon-908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63.scope.
Nov 29 02:26:53 np0005539505 nova_compute[186958]: 2025-11-29 07:26:53.010 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:26:53 np0005539505 nova_compute[186958]: 2025-11-29 07:26:53.015 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:26:53 np0005539505 podman[239890]: 2025-11-29 07:26:52.925102353 +0000 UTC m=+0.024172065 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:26:53 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:26:53 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9b129672d6df072e0524ec72ae6b1c146a5832f37d2bda6412831228c7196191/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:26:53 np0005539505 podman[239890]: 2025-11-29 07:26:53.051747807 +0000 UTC m=+0.150817499 container init 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 29 02:26:53 np0005539505 podman[239890]: 2025-11-29 07:26:53.056932394 +0000 UTC m=+0.156002056 container start 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:26:53 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [NOTICE]   (239910) : New worker (239912) forked
Nov 29 02:26:53 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [NOTICE]   (239910) : Loading success.
Nov 29 02:26:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:26:53Z|00590|binding|INFO|Releasing lport e71a836b-a9d0-4b09-9143-f69ad5f9879a from this chassis (sb_readonly=0)
Nov 29 02:26:53 np0005539505 nova_compute[186958]: 2025-11-29 07:26:53.773 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.512 186962 DEBUG nova.compute.manager [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.513 186962 DEBUG oslo_concurrency.lockutils [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.514 186962 DEBUG oslo_concurrency.lockutils [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.514 186962 DEBUG oslo_concurrency.lockutils [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.514 186962 DEBUG nova.compute.manager [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.514 186962 WARNING nova.compute.manager [req-f12e6662-e2a8-46ed-a26a-ad4d04a378fd req-ce8a8392-e86c-4e31-9270-25bf820992ea 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:26:55 np0005539505 nova_compute[186958]: 2025-11-29 07:26:55.568 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:26:57 np0005539505 nova_compute[186958]: 2025-11-29 07:26:57.166 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:00 np0005539505 nova_compute[186958]: 2025-11-29 07:27:00.571 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:02 np0005539505 nova_compute[186958]: 2025-11-29 07:27:02.170 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:02 np0005539505 podman[239921]: 2025-11-29 07:27:02.761117883 +0000 UTC m=+0.089604457 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:27:02 np0005539505 podman[239922]: 2025-11-29 07:27:02.794795736 +0000 UTC m=+0.113969187 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:27:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:04Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:b4:e0 10.100.0.4
Nov 29 02:27:05 np0005539505 nova_compute[186958]: 2025-11-29 07:27:05.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:05 np0005539505 podman[239981]: 2025-11-29 07:27:05.756044235 +0000 UTC m=+0.069719524 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:27:05 np0005539505 podman[239980]: 2025-11-29 07:27:05.768315973 +0000 UTC m=+0.085918673 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:27:07 np0005539505 nova_compute[186958]: 2025-11-29 07:27:07.174 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:09Z|00591|binding|INFO|Releasing lport e71a836b-a9d0-4b09-9143-f69ad5f9879a from this chassis (sb_readonly=0)
Nov 29 02:27:09 np0005539505 nova_compute[186958]: 2025-11-29 07:27:09.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:10 np0005539505 nova_compute[186958]: 2025-11-29 07:27:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:10 np0005539505 nova_compute[186958]: 2025-11-29 07:27:10.623 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:11 np0005539505 nova_compute[186958]: 2025-11-29 07:27:11.963 186962 INFO nova.compute.manager [None req-68766ef4-bbb2-4292-bd54-955569ba47d1 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Get console output#033[00m
Nov 29 02:27:11 np0005539505 nova_compute[186958]: 2025-11-29 07:27:11.968 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:27:12 np0005539505 nova_compute[186958]: 2025-11-29 07:27:12.178 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:14 np0005539505 nova_compute[186958]: 2025-11-29 07:27:14.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:15 np0005539505 nova_compute[186958]: 2025-11-29 07:27:15.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:16 np0005539505 nova_compute[186958]: 2025-11-29 07:27:16.623 186962 DEBUG nova.compute.manager [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:16 np0005539505 nova_compute[186958]: 2025-11-29 07:27:16.624 186962 DEBUG nova.compute.manager [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing instance network info cache due to event network-changed-332c67d6-3bc6-4636-b59f-6368eb8b8a14. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:27:16 np0005539505 nova_compute[186958]: 2025-11-29 07:27:16.624 186962 DEBUG oslo_concurrency.lockutils [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:16 np0005539505 nova_compute[186958]: 2025-11-29 07:27:16.624 186962 DEBUG oslo_concurrency.lockutils [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:16 np0005539505 nova_compute[186958]: 2025-11-29 07:27:16.624 186962 DEBUG nova.network.neutron [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Refreshing network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.185 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.553 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.553 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.554 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.554 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.555 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.618 186962 INFO nova.compute.manager [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Terminating instance#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.722 186962 DEBUG nova.compute.manager [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:27:17 np0005539505 kernel: tap332c67d6-3b (unregistering): left promiscuous mode
Nov 29 02:27:17 np0005539505 NetworkManager[55134]: <info>  [1764401237.7412] device (tap332c67d6-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.749 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:17Z|00592|binding|INFO|Releasing lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 from this chassis (sb_readonly=0)
Nov 29 02:27:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:17Z|00593|binding|INFO|Setting lport 332c67d6-3bc6-4636-b59f-6368eb8b8a14 down in Southbound
Nov 29 02:27:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:17Z|00594|binding|INFO|Removing iface tap332c67d6-3b ovn-installed in OVS
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.752 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:17 np0005539505 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000080.scope: Deactivated successfully.
Nov 29 02:27:17 np0005539505 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000080.scope: Consumed 13.141s CPU time.
Nov 29 02:27:17 np0005539505 systemd-machined[153285]: Machine qemu-67-instance-00000080 terminated.
Nov 29 02:27:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:17.904 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:b4:e0 10.100.0.4'], port_security=['fa:16:3e:02:b4:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'ce56878f-34d0-4d8d-bc77-ee4b14e32746', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-615772bd-4aec-4aff-ba55-f16ad03ef223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': '4c56a9cf-f5de-4201-b604-112c4ca8006f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc3dafc6-3da1-477c-a8c4-bd3e3f4c13f9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=332c67d6-3bc6-4636-b59f-6368eb8b8a14) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:17.905 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 332c67d6-3bc6-4636-b59f-6368eb8b8a14 in datapath 615772bd-4aec-4aff-ba55-f16ad03ef223 unbound from our chassis#033[00m
Nov 29 02:27:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:17.906 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 615772bd-4aec-4aff-ba55-f16ad03ef223, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:27:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:17.907 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8a180caf-9850-4d9b-b901-50fe2801a0d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:17.908 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 namespace which is not needed anymore#033[00m
Nov 29 02:27:17 np0005539505 NetworkManager[55134]: <info>  [1764401237.9396] manager: (tap332c67d6-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.980 186962 INFO nova.virt.libvirt.driver [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Instance destroyed successfully.#033[00m
Nov 29 02:27:17 np0005539505 nova_compute[186958]: 2025-11-29 07:27:17.981 186962 DEBUG nova.objects.instance [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid ce56878f-34d0-4d8d-bc77-ee4b14e32746 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:18 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [NOTICE]   (239910) : haproxy version is 2.8.14-c23fe91
Nov 29 02:27:18 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [NOTICE]   (239910) : path to executable is /usr/sbin/haproxy
Nov 29 02:27:18 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [WARNING]  (239910) : Exiting Master process...
Nov 29 02:27:18 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [ALERT]    (239910) : Current worker (239912) exited with code 143 (Terminated)
Nov 29 02:27:18 np0005539505 neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223[239906]: [WARNING]  (239910) : All workers exited. Exiting... (0)
Nov 29 02:27:18 np0005539505 systemd[1]: libpod-908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63.scope: Deactivated successfully.
Nov 29 02:27:18 np0005539505 podman[240053]: 2025-11-29 07:27:18.029709606 +0000 UTC m=+0.042192625 container died 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:27:18 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63-userdata-shm.mount: Deactivated successfully.
Nov 29 02:27:18 np0005539505 systemd[1]: var-lib-containers-storage-overlay-9b129672d6df072e0524ec72ae6b1c146a5832f37d2bda6412831228c7196191-merged.mount: Deactivated successfully.
Nov 29 02:27:18 np0005539505 podman[240053]: 2025-11-29 07:27:18.070593103 +0000 UTC m=+0.083076122 container cleanup 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:27:18 np0005539505 systemd[1]: libpod-conmon-908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63.scope: Deactivated successfully.
Nov 29 02:27:18 np0005539505 podman[240085]: 2025-11-29 07:27:18.142478438 +0000 UTC m=+0.047750213 container remove 908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.148 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9682cb39-b2a0-41e1-9724-5045fb0f46ba]: (4, ('Sat Nov 29 07:27:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 (908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63)\n908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63\nSat Nov 29 07:27:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 (908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63)\n908e96a6a2cd141badd579d73be19484c80bdc5576668d2729686d675097cd63\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.150 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cee1f89b-bab6-4e5c-9b22-072d4eb26310]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.152 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap615772bd-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:18 np0005539505 kernel: tap615772bd-40: left promiscuous mode
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.180 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.185 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9f76ebdc-e771-48e2-ad67-6b6f0d8bc25c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.201 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9761dbcf-9c3f-4eab-8557-37ccff9fcd32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.203 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dfd20b-55cc-42ed-8f85-d828cf7bb185]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.220 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ca06dc0c-fea4-46c1-8eef-fcf076eafd0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 665991, 'reachable_time': 26729, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240104, 'error': None, 'target': 'ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.222 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-615772bd-4aec-4aff-ba55-f16ad03ef223 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:27:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:18.222 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b8a040e1-e013-41d9-b4fe-de452566ebf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:18 np0005539505 systemd[1]: run-netns-ovnmeta\x2d615772bd\x2d4aec\x2d4aff\x2dba55\x2df16ad03ef223.mount: Deactivated successfully.
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.286 186962 DEBUG nova.virt.libvirt.vif [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-310936262',display_name='tempest-TestNetworkAdvancedServerOps-server-310936262',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-310936262',id=128,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNS74LvKGOBnFlPfVcT+bzwE06TQiEhYzDaSehhA7gKF48QuXG2xstievQQeYVYJy76I2fhF3gR/iV3vHo49vgT/+dyl3wOLnxorQGUWu2JjPvz0ooEDAD+SYpK4Yy329w==',key_name='tempest-TestNetworkAdvancedServerOps-2140036758',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:26:05Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-tvius9cz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:26:52Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=ce56878f-34d0-4d8d-bc77-ee4b14e32746,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.286 186962 DEBUG nova.network.os_vif_util [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.287 186962 DEBUG nova.network.os_vif_util [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.287 186962 DEBUG os_vif [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.289 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.289 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap332c67d6-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.291 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.295 186962 INFO os_vif [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:b4:e0,bridge_name='br-int',has_traffic_filtering=True,id=332c67d6-3bc6-4636-b59f-6368eb8b8a14,network=Network(615772bd-4aec-4aff-ba55-f16ad03ef223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap332c67d6-3b')#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.296 186962 INFO nova.virt.libvirt.driver [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Deleting instance files /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746_del#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.297 186962 INFO nova.virt.libvirt.driver [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Deletion of /var/lib/nova/instances/ce56878f-34d0-4d8d-bc77-ee4b14e32746_del complete#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.606 186962 INFO nova.compute.manager [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Took 0.88 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.606 186962 DEBUG oslo.service.loopingcall [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.607 186962 DEBUG nova.compute.manager [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:27:18 np0005539505 nova_compute[186958]: 2025-11-29 07:27:18.607 186962 DEBUG nova.network.neutron [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.046 186962 DEBUG nova.compute.manager [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.047 186962 DEBUG oslo_concurrency.lockutils [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.047 186962 DEBUG oslo_concurrency.lockutils [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.047 186962 DEBUG oslo_concurrency.lockutils [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.047 186962 DEBUG nova.compute.manager [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.047 186962 DEBUG nova.compute.manager [req-a1f1968d-565f-4a66-972a-f6739d417030 req-4d596409-6f2f-4222-a3f7-532e2bde1ff8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-unplugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:27:19 np0005539505 nova_compute[186958]: 2025-11-29 07:27:19.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:20 np0005539505 nova_compute[186958]: 2025-11-29 07:27:20.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:20 np0005539505 podman[240106]: 2025-11-29 07:27:20.753465472 +0000 UTC m=+0.070142846 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:27:20 np0005539505 podman[240105]: 2025-11-29 07:27:20.763476566 +0000 UTC m=+0.086066057 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, version=9.6, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 29 02:27:21 np0005539505 nova_compute[186958]: 2025-11-29 07:27:21.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:21 np0005539505 nova_compute[186958]: 2025-11-29 07:27:21.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:22 np0005539505 podman[240152]: 2025-11-29 07:27:22.7486647 +0000 UTC m=+0.069590851 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 29 02:27:23 np0005539505 nova_compute[186958]: 2025-11-29 07:27:23.292 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:23 np0005539505 nova_compute[186958]: 2025-11-29 07:27:23.908 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:23 np0005539505 nova_compute[186958]: 2025-11-29 07:27:23.909 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:27:23 np0005539505 nova_compute[186958]: 2025-11-29 07:27:23.909 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.762 186962 DEBUG nova.compute.manager [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.763 186962 DEBUG oslo_concurrency.lockutils [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.763 186962 DEBUG oslo_concurrency.lockutils [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.764 186962 DEBUG oslo_concurrency.lockutils [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.764 186962 DEBUG nova.compute.manager [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] No waiting events found dispatching network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.764 186962 WARNING nova.compute.manager [req-3978aad2-6532-42a3-b8d2-9d4c0b3c0fa9 req-aed04e9a-2095-4bdc-b7cf-6483a4c3d247 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received unexpected event network-vif-plugged-332c67d6-3bc6-4636-b59f-6368eb8b8a14 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.765 186962 DEBUG nova.compute.manager [req-53c39158-2293-4ca7-9c81-eccd063e16ce req-afd6f716-c32a-4124-85c0-fd8bcf8f2237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Received event network-vif-deleted-332c67d6-3bc6-4636-b59f-6368eb8b8a14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.766 186962 INFO nova.compute.manager [req-53c39158-2293-4ca7-9c81-eccd063e16ce req-afd6f716-c32a-4124-85c0-fd8bcf8f2237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Neutron deleted interface 332c67d6-3bc6-4636-b59f-6368eb8b8a14; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.766 186962 DEBUG nova.network.neutron [req-53c39158-2293-4ca7-9c81-eccd063e16ce req-afd6f716-c32a-4124-85c0-fd8bcf8f2237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.932 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.932 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.933 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.951 186962 DEBUG nova.network.neutron [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.993 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.994 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.994 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.994 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:27:24 np0005539505 nova_compute[186958]: 2025-11-29 07:27:24.999 186962 INFO nova.compute.manager [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Took 6.39 seconds to deallocate network for instance.#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.111 186962 DEBUG nova.compute.manager [req-53c39158-2293-4ca7-9c81-eccd063e16ce req-afd6f716-c32a-4124-85c0-fd8bcf8f2237 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Detach interface failed, port_id=332c67d6-3bc6-4636-b59f-6368eb8b8a14, reason: Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.160 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.161 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5683MB free_disk=73.07388687133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.161 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.161 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.471 186962 WARNING nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance ce56878f-34d0-4d8d-bc77-ee4b14e32746 is not being actively managed by this compute host but has allocations referencing this compute host: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocation because we do not know what to do.#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.472 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.472 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.532 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.612 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.620 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.629 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.666 186962 DEBUG nova.network.neutron [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updated VIF entry in instance network info cache for port 332c67d6-3bc6-4636-b59f-6368eb8b8a14. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:27:25 np0005539505 nova_compute[186958]: 2025-11-29 07:27:25.667 186962 DEBUG nova.network.neutron [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Updating instance_info_cache with network_info: [{"id": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "address": "fa:16:3e:02:b4:e0", "network": {"id": "615772bd-4aec-4aff-ba55-f16ad03ef223", "bridge": "br-int", "label": "tempest-network-smoke--569883081", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap332c67d6-3b", "ovs_interfaceid": "332c67d6-3bc6-4636-b59f-6368eb8b8a14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.336 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.337 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.337 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.342 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.495 186962 DEBUG oslo_concurrency.lockutils [req-4bccb54d-9833-431a-bc09-8177072b27c0 req-fb721610-1463-41e4-8827-5508df227487 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-ce56878f-34d0-4d8d-bc77-ee4b14e32746" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.519 186962 INFO nova.scheduler.client.report [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance ce56878f-34d0-4d8d-bc77-ee4b14e32746#033[00m
Nov 29 02:27:26 np0005539505 nova_compute[186958]: 2025-11-29 07:27:26.802 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:27.507 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:27.508 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:27.508 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:27 np0005539505 nova_compute[186958]: 2025-11-29 07:27:27.720 186962 DEBUG oslo_concurrency.lockutils [None req-c1225abe-05ee-4c1f-a013-dcec0db0ffaf bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "ce56878f-34d0-4d8d-bc77-ee4b14e32746" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:28 np0005539505 nova_compute[186958]: 2025-11-29 07:27:28.293 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:30 np0005539505 nova_compute[186958]: 2025-11-29 07:27:30.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:31 np0005539505 nova_compute[186958]: 2025-11-29 07:27:31.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:31 np0005539505 nova_compute[186958]: 2025-11-29 07:27:31.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:27:32 np0005539505 nova_compute[186958]: 2025-11-29 07:27:32.979 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401237.9781272, ce56878f-34d0-4d8d-bc77-ee4b14e32746 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:32 np0005539505 nova_compute[186958]: 2025-11-29 07:27:32.979 186962 INFO nova.compute.manager [-] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:27:33 np0005539505 nova_compute[186958]: 2025-11-29 07:27:33.295 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:33 np0005539505 podman[240172]: 2025-11-29 07:27:33.71820078 +0000 UTC m=+0.048623267 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:27:33 np0005539505 podman[240173]: 2025-11-29 07:27:33.761995 +0000 UTC m=+0.085605654 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:27:34 np0005539505 nova_compute[186958]: 2025-11-29 07:27:34.228 186962 DEBUG nova.compute.manager [None req-427507d3-a29d-4955-a8f1-550f8bccf39f - - - - - -] [instance: ce56878f-34d0-4d8d-bc77-ee4b14e32746] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:34 np0005539505 nova_compute[186958]: 2025-11-29 07:27:34.635 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:34 np0005539505 nova_compute[186958]: 2025-11-29 07:27:34.635 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:27:34 np0005539505 nova_compute[186958]: 2025-11-29 07:27:34.635 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:27:34 np0005539505 nova_compute[186958]: 2025-11-29 07:27:34.820 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:27:35 np0005539505 nova_compute[186958]: 2025-11-29 07:27:35.633 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:36 np0005539505 podman[240222]: 2025-11-29 07:27:36.767637605 +0000 UTC m=+0.094954468 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:27:36 np0005539505 podman[240223]: 2025-11-29 07:27:36.78477504 +0000 UTC m=+0.105828276 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:27:37 np0005539505 nova_compute[186958]: 2025-11-29 07:27:37.527 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:37 np0005539505 nova_compute[186958]: 2025-11-29 07:27:37.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539505 nova_compute[186958]: 2025-11-29 07:27:38.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:38.483 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:38 np0005539505 nova_compute[186958]: 2025-11-29 07:27:38.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:38.485 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:27:40 np0005539505 nova_compute[186958]: 2025-11-29 07:27:40.657 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:43 np0005539505 nova_compute[186958]: 2025-11-29 07:27:43.298 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:43 np0005539505 nova_compute[186958]: 2025-11-29 07:27:43.407 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:43 np0005539505 nova_compute[186958]: 2025-11-29 07:27:43.407 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:43 np0005539505 nova_compute[186958]: 2025-11-29 07:27:43.825 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.122 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.123 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.131 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.131 186962 INFO nova.compute.claims [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.308 186962 DEBUG nova.compute.provider_tree [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.337 186962 DEBUG nova.scheduler.client.report [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.385 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.386 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.527 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.528 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.550 186962 INFO nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.571 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.705 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.707 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.707 186962 INFO nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Creating image(s)#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.708 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.708 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.709 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.728 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.783 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.785 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.786 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.809 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.864 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.866 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.903 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.904 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.905 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.960 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.961 186962 DEBUG nova.virt.disk.api [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Checking if we can resize image /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:27:44 np0005539505 nova_compute[186958]: 2025-11-29 07:27:44.962 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.017 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.019 186962 DEBUG nova.virt.disk.api [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Cannot resize image /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.019 186962 DEBUG nova.objects.instance [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.032 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.032 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Ensure instance console log exists: /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.033 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.033 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.034 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.653 186962 DEBUG nova.policy [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:27:45 np0005539505 nova_compute[186958]: 2025-11-29 07:27:45.658 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:47 np0005539505 nova_compute[186958]: 2025-11-29 07:27:47.307 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Successfully created port: 1e2538b2-9233-45f7-9334-e7fcdba1da31 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:27:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:47.487 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:48 np0005539505 nova_compute[186958]: 2025-11-29 07:27:48.300 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:50 np0005539505 nova_compute[186958]: 2025-11-29 07:27:50.660 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:51 np0005539505 nova_compute[186958]: 2025-11-29 07:27:51.689 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Successfully updated port: 1e2538b2-9233-45f7-9334-e7fcdba1da31 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:27:51 np0005539505 podman[240280]: 2025-11-29 07:27:51.717338525 +0000 UTC m=+0.043537754 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:27:51 np0005539505 nova_compute[186958]: 2025-11-29 07:27:51.729 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:51 np0005539505 nova_compute[186958]: 2025-11-29 07:27:51.729 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquired lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:51 np0005539505 nova_compute[186958]: 2025-11-29 07:27:51.729 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:27:51 np0005539505 podman[240279]: 2025-11-29 07:27:51.750518164 +0000 UTC m=+0.079602524 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 29 02:27:52 np0005539505 nova_compute[186958]: 2025-11-29 07:27:52.746 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:27:53 np0005539505 nova_compute[186958]: 2025-11-29 07:27:53.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:53 np0005539505 podman[240324]: 2025-11-29 07:27:53.72941172 +0000 UTC m=+0.059139914 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:27:53 np0005539505 nova_compute[186958]: 2025-11-29 07:27:53.794 186962 DEBUG nova.compute.manager [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-changed-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:53 np0005539505 nova_compute[186958]: 2025-11-29 07:27:53.794 186962 DEBUG nova.compute.manager [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Refreshing instance network info cache due to event network-changed-1e2538b2-9233-45f7-9334-e7fcdba1da31. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:27:53 np0005539505 nova_compute[186958]: 2025-11-29 07:27:53.794 186962 DEBUG oslo_concurrency.lockutils [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.662 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.719 186962 DEBUG nova.network.neutron [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updating instance_info_cache with network_info: [{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.759 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Releasing lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.759 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance network_info: |[{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.760 186962 DEBUG oslo_concurrency.lockutils [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.760 186962 DEBUG nova.network.neutron [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Refreshing network info cache for port 1e2538b2-9233-45f7-9334-e7fcdba1da31 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.762 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start _get_guest_xml network_info=[{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.766 186962 WARNING nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.773 186962 DEBUG nova.virt.libvirt.host [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.774 186962 DEBUG nova.virt.libvirt.host [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.783 186962 DEBUG nova.virt.libvirt.host [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.784 186962 DEBUG nova.virt.libvirt.host [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.785 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.786 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.786 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.786 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.787 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.787 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.787 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.787 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.788 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.788 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.788 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.788 186962 DEBUG nova.virt.hardware [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.793 186962 DEBUG nova.virt.libvirt.vif [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:44Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.793 186962 DEBUG nova.network.os_vif_util [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.794 186962 DEBUG nova.network.os_vif_util [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.795 186962 DEBUG nova.objects.instance [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.822 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <uuid>19c7d8b2-3f1a-40cf-a538-dd2752970ffb</uuid>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <name>instance-00000083</name>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1624881395</nova:name>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:27:55</nova:creationTime>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:user uuid="3e2a40601ced4de78fe1767769f262c0">tempest-ListServerFiltersTestJSON-1571311845-project-member</nova:user>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:project uuid="7843cfa993a1428aaaa660321ebba1ac">tempest-ListServerFiltersTestJSON-1571311845</nova:project>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        <nova:port uuid="1e2538b2-9233-45f7-9334-e7fcdba1da31">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="serial">19c7d8b2-3f1a-40cf-a538-dd2752970ffb</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="uuid">19c7d8b2-3f1a-40cf-a538-dd2752970ffb</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a4:02:7c"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <target dev="tap1e2538b2-92"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/console.log" append="off"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:27:55 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:27:55 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:27:55 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:27:55 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.823 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Preparing to wait for external event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.824 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.824 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.824 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.825 186962 DEBUG nova.virt.libvirt.vif [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:27:44Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.825 186962 DEBUG nova.network.os_vif_util [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.826 186962 DEBUG nova.network.os_vif_util [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.826 186962 DEBUG os_vif [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.827 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.827 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.827 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.831 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.831 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e2538b2-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.831 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e2538b2-92, col_values=(('external_ids', {'iface-id': '1e2538b2-9233-45f7-9334-e7fcdba1da31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:02:7c', 'vm-uuid': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.833 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:55 np0005539505 NetworkManager[55134]: <info>  [1764401275.8336] manager: (tap1e2538b2-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.838 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:55 np0005539505 nova_compute[186958]: 2025-11-29 07:27:55.838 186962 INFO os_vif [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92')#033[00m
Nov 29 02:27:56 np0005539505 nova_compute[186958]: 2025-11-29 07:27:56.101 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:27:56 np0005539505 nova_compute[186958]: 2025-11-29 07:27:56.102 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:27:56 np0005539505 nova_compute[186958]: 2025-11-29 07:27:56.102 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] No VIF found with MAC fa:16:3e:a4:02:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:27:56 np0005539505 nova_compute[186958]: 2025-11-29 07:27:56.103 186962 INFO nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Using config drive#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.260 186962 INFO nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Creating config drive at /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.265 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcxhljgqp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.390 186962 DEBUG oslo_concurrency.processutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcxhljgqp" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:27:57 np0005539505 kernel: tap1e2538b2-92: entered promiscuous mode
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.4503] manager: (tap1e2538b2-92): new Tun device (/org/freedesktop/NetworkManager/Devices/294)
Nov 29 02:27:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:57Z|00595|binding|INFO|Claiming lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 for this chassis.
Nov 29 02:27:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:57Z|00596|binding|INFO|1e2538b2-9233-45f7-9334-e7fcdba1da31: Claiming fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.451 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.453 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.471 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.472 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 bound to our chassis#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.473 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28412826-5463-46e4-95cb-a7d788b1ab15#033[00m
Nov 29 02:27:57 np0005539505 systemd-udevd[240360]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.484 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcf8426-373a-47c1-a745-038c35353f3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.485 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28412826-51 in ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.487 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.4889] device (tap1e2538b2-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.4896] device (tap1e2538b2-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:27:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:57Z|00597|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 ovn-installed in OVS
Nov 29 02:27:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:57Z|00598|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 up in Southbound
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.487 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28412826-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.487 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed80a55-6bb9-40fe-8953-75eeeeddad2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.489 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9167629d-ed47-4119-93ec-82b12bb5070f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 systemd-machined[153285]: New machine qemu-68-instance-00000083.
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.501 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d08db6-4f83-4dcf-91f7-4cf1d16200e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 systemd[1]: Started Virtual Machine qemu-68-instance-00000083.
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.524 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[da0236b9-f1e1-478a-80cf-f00da567fcc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.555 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c2694a-7ffb-4067-89a1-b8f6c82456a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.5658] manager: (tap28412826-50): new Veth device (/org/freedesktop/NetworkManager/Devices/295)
Nov 29 02:27:57 np0005539505 systemd-udevd[240365]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.565 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e23c62d5-2382-445a-9529-bc64f6330cfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.599 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[315acd82-ad78-4ae1-ba25-c00ae60ae126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.604 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bad68169-43ec-4236-a8bf-4d37124aeffa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.6330] device (tap28412826-50): carrier: link connected
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.639 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9b406b84-3bcc-4ebf-a0d2-151c83ab9883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.658 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb16ed1f-e1ee-4ef4-9dab-ad299318e531]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672521, 'reachable_time': 38124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240398, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.680 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b118ac-68f1-4c68-9042-ab4b1e543c29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c072'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 672521, 'tstamp': 672521}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240403, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.699 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[660a8071-6b06-4fef-876b-6a9d2c895d12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 191], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672521, 'reachable_time': 38124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240404, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.730 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401277.7299998, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.731 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Started (Lifecycle Event)#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.736 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[46e93c84-4190-4d6a-b250-e3f462a1f6cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.759 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.763 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401277.7302613, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.763 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.784 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.788 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.789 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e404048-1d73-491d-bd73-e082da5a3bd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.791 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.791 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.792 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28412826-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.810 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.829 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 NetworkManager[55134]: <info>  [1764401277.8299] manager: (tap28412826-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Nov 29 02:27:57 np0005539505 kernel: tap28412826-50: entered promiscuous mode
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.830 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.833 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28412826-50, col_values=(('external_ids', {'iface-id': '2abf732f-8f8c-470e-b6e2-def265b14d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:27:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:27:57Z|00599|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.834 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.835 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.836 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fac0fd37-e6f9-40dc-810b-6e97aba4d13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.837 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:27:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:27:57.837 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'env', 'PROCESS_TAG=haproxy-28412826-5463-46e4-95cb-a7d788b1ab15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28412826-5463-46e4-95cb-a7d788b1ab15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:27:57 np0005539505 nova_compute[186958]: 2025-11-29 07:27:57.846 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:27:58 np0005539505 podman[240437]: 2025-11-29 07:27:58.221416784 +0000 UTC m=+0.095956147 container create f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:27:58 np0005539505 podman[240437]: 2025-11-29 07:27:58.14530888 +0000 UTC m=+0.019848263 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.253 186962 DEBUG nova.compute.manager [req-e37b9cb8-9ebb-40bd-8b54-5feb8888879f req-60b53b37-efa4-47ab-b122-ecc56c60f6da 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.254 186962 DEBUG oslo_concurrency.lockutils [req-e37b9cb8-9ebb-40bd-8b54-5feb8888879f req-60b53b37-efa4-47ab-b122-ecc56c60f6da 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.254 186962 DEBUG oslo_concurrency.lockutils [req-e37b9cb8-9ebb-40bd-8b54-5feb8888879f req-60b53b37-efa4-47ab-b122-ecc56c60f6da 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.255 186962 DEBUG oslo_concurrency.lockutils [req-e37b9cb8-9ebb-40bd-8b54-5feb8888879f req-60b53b37-efa4-47ab-b122-ecc56c60f6da 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.255 186962 DEBUG nova.compute.manager [req-e37b9cb8-9ebb-40bd-8b54-5feb8888879f req-60b53b37-efa4-47ab-b122-ecc56c60f6da 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Processing event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.255 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:27:58 np0005539505 systemd[1]: Started libpod-conmon-f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4.scope.
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.258 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401278.2587268, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.259 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.261 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.267 186962 INFO nova.virt.libvirt.driver [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance spawned successfully.#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.268 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:27:58 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.293 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:58 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d62a61b36e428f8380b0faab1c1d90b8aff2bea52253ee73e0fbbe284e37e861/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.299 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.300 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.301 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.302 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.303 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.305 186962 DEBUG nova.virt.libvirt.driver [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.312 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:27:58 np0005539505 podman[240437]: 2025-11-29 07:27:58.342333706 +0000 UTC m=+0.216873119 container init f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:27:58 np0005539505 podman[240437]: 2025-11-29 07:27:58.349009765 +0000 UTC m=+0.223549138 container start f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.362 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:27:58 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [NOTICE]   (240456) : New worker (240458) forked
Nov 29 02:27:58 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [NOTICE]   (240456) : Loading success.
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.464 186962 INFO nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Took 13.76 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.465 186962 DEBUG nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.662 186962 INFO nova.compute.manager [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Took 14.61 seconds to build instance.#033[00m
Nov 29 02:27:58 np0005539505 nova_compute[186958]: 2025-11-29 07:27:58.699 186962 DEBUG oslo_concurrency.lockutils [None req-45fe26a0-c05e-4b6b-b9dc-37d263bbdf56 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:27:59 np0005539505 nova_compute[186958]: 2025-11-29 07:27:59.896 186962 DEBUG nova.network.neutron [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updated VIF entry in instance network info cache for port 1e2538b2-9233-45f7-9334-e7fcdba1da31. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:27:59 np0005539505 nova_compute[186958]: 2025-11-29 07:27:59.897 186962 DEBUG nova.network.neutron [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updating instance_info_cache with network_info: [{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:27:59 np0005539505 nova_compute[186958]: 2025-11-29 07:27:59.918 186962 DEBUG oslo_concurrency.lockutils [req-c22629cb-365b-47d8-b9aa-4e268507ad1d req-2516a225-660a-4bf1-af38-69314f9c290d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.480 186962 DEBUG nova.compute.manager [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.480 186962 DEBUG oslo_concurrency.lockutils [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.481 186962 DEBUG oslo_concurrency.lockutils [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.481 186962 DEBUG oslo_concurrency.lockutils [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.481 186962 DEBUG nova.compute.manager [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.481 186962 WARNING nova.compute.manager [req-6e17a381-552e-4004-a41d-0dd0dc3cb6af req-0dd2b7da-8dfb-4022-b97a-6f832c19d0b2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.664 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:00 np0005539505 nova_compute[186958]: 2025-11-29 07:28:00.833 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:04 np0005539505 podman[240467]: 2025-11-29 07:28:04.723009583 +0000 UTC m=+0.054823503 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:28:04 np0005539505 podman[240468]: 2025-11-29 07:28:04.748832563 +0000 UTC m=+0.078683477 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 02:28:05 np0005539505 nova_compute[186958]: 2025-11-29 07:28:05.666 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:05 np0005539505 nova_compute[186958]: 2025-11-29 07:28:05.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:07 np0005539505 podman[240516]: 2025-11-29 07:28:07.725437137 +0000 UTC m=+0.053373781 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 02:28:07 np0005539505 podman[240517]: 2025-11-29 07:28:07.725437587 +0000 UTC m=+0.052573729 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:28:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:09Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:28:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:09Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.313 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.314 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.332 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.479 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.480 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.485 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.486 186962 INFO nova.compute.claims [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.641 186962 DEBUG nova.compute.provider_tree [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.667 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:10 np0005539505 nova_compute[186958]: 2025-11-29 07:28:10.837 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:11 np0005539505 nova_compute[186958]: 2025-11-29 07:28:11.558 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:11 np0005539505 nova_compute[186958]: 2025-11-29 07:28:11.984 186962 DEBUG nova.scheduler.client.report [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.025 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.047 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.048 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.132 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.133 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.159 186962 INFO nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.186 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.331 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.333 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.333 186962 INFO nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Creating image(s)#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.334 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.334 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.335 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.353 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.411 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.412 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.413 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.428 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.466 186962 DEBUG nova.policy [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c231e63624d44fc19e0989abfb1afb22', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.482 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.483 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.530 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.531 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.531 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.589 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.590 186962 DEBUG nova.virt.disk.api [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Checking if we can resize image /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.590 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.645 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.646 186962 DEBUG nova.virt.disk.api [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Cannot resize image /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.647 186962 DEBUG nova.objects.instance [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'migration_context' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.678 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.679 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Ensure instance console log exists: /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.680 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.680 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:12 np0005539505 nova_compute[186958]: 2025-11-29 07:28:12.681 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:14.682 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8:0:1:f816:3eff:feb9:4633 2001:db8::f816:3eff:feb9:4633'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb9:4633/64 2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f0ce4da0-40ec-44ef-8179-4cbfad9b57f1) old=Port_Binding(mac=['fa:16:3e:b9:46:33 2001:db8::f816:3eff:feb9:4633'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb9:4633/64', 'neutron:device_id': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:14.684 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e updated#033[00m
Nov 29 02:28:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:14.685 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:14.686 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[00f19e94-4c28-475f-936f-480808f05f07]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:14 np0005539505 nova_compute[186958]: 2025-11-29 07:28:14.966 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Successfully created port: 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:28:15 np0005539505 nova_compute[186958]: 2025-11-29 07:28:15.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:15 np0005539505 nova_compute[186958]: 2025-11-29 07:28:15.670 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:15 np0005539505 nova_compute[186958]: 2025-11-29 07:28:15.839 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.678 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Successfully updated port: 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.697 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.698 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.698 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.817 186962 DEBUG nova.compute.manager [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.817 186962 DEBUG nova.compute.manager [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing instance network info cache due to event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.817 186962 DEBUG oslo_concurrency.lockutils [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:16 np0005539505 nova_compute[186958]: 2025-11-29 07:28:16.914 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.379 186962 DEBUG nova.network.neutron [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.406 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.406 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance network_info: |[{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.407 186962 DEBUG oslo_concurrency.lockutils [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.407 186962 DEBUG nova.network.neutron [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.411 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Start _get_guest_xml network_info=[{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.416 186962 WARNING nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.426 186962 DEBUG nova.virt.libvirt.host [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.427 186962 DEBUG nova.virt.libvirt.host [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.431 186962 DEBUG nova.virt.libvirt.host [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.432 186962 DEBUG nova.virt.libvirt.host [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.433 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.433 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.434 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.434 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.434 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.435 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.435 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.435 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.436 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.436 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.436 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.436 186962 DEBUG nova.virt.hardware [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.440 186962 DEBUG nova.virt.libvirt.vif [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-468452072',display_name='tempest-TestNetworkAdvancedServerOps-server-468452072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-468452072',id=134,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYkyrD6GH7Y6zXJE6UdUt8xCR/1va5hW1FscfbM7L30ylZqlz2D8TZeKR1ExqmSnTdiWL+UU0qKusuYO7HAthTySgqkMobIXogk9BOrKSRDnqKaAi1AWXXuuhtQyh3SVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1220905380',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-4no3w984',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:12Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=97b39e54-312a-4ebc-863b-e1ef5f4cf363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.440 186962 DEBUG nova.network.os_vif_util [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.441 186962 DEBUG nova.network.os_vif_util [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.442 186962 DEBUG nova.objects.instance [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.467 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <uuid>97b39e54-312a-4ebc-863b-e1ef5f4cf363</uuid>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <name>instance-00000086</name>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-468452072</nova:name>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:28:18</nova:creationTime>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:user uuid="bfd2024670594b10941cec8a59d2573f">tempest-TestNetworkAdvancedServerOps-1380683659-project-member</nova:user>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:project uuid="c231e63624d44fc19e0989abfb1afb22">tempest-TestNetworkAdvancedServerOps-1380683659</nova:project>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        <nova:port uuid="9083d4b6-b3e2-451d-b686-a9cab7e5a2f5">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="serial">97b39e54-312a-4ebc-863b-e1ef5f4cf363</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="uuid">97b39e54-312a-4ebc-863b-e1ef5f4cf363</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.config"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:19:93:e0"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <target dev="tap9083d4b6-b3"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/console.log" append="off"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:28:18 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:28:18 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:28:18 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:28:18 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.468 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Preparing to wait for external event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.469 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.469 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.469 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.470 186962 DEBUG nova.virt.libvirt.vif [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-468452072',display_name='tempest-TestNetworkAdvancedServerOps-server-468452072',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-468452072',id=134,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYkyrD6GH7Y6zXJE6UdUt8xCR/1va5hW1FscfbM7L30ylZqlz2D8TZeKR1ExqmSnTdiWL+UU0qKusuYO7HAthTySgqkMobIXogk9BOrKSRDnqKaAi1AWXXuuhtQyh3SVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1220905380',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-4no3w984',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:12Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=97b39e54-312a-4ebc-863b-e1ef5f4cf363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.470 186962 DEBUG nova.network.os_vif_util [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.470 186962 DEBUG nova.network.os_vif_util [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.471 186962 DEBUG os_vif [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.471 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.472 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.472 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.475 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9083d4b6-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.475 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9083d4b6-b3, col_values=(('external_ids', {'iface-id': '9083d4b6-b3e2-451d-b686-a9cab7e5a2f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:93:e0', 'vm-uuid': '97b39e54-312a-4ebc-863b-e1ef5f4cf363'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:18 np0005539505 NetworkManager[55134]: <info>  [1764401298.4771] manager: (tap9083d4b6-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.484 186962 INFO os_vif [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3')#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.571 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.571 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.572 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] No VIF found with MAC fa:16:3e:19:93:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:28:18 np0005539505 nova_compute[186958]: 2025-11-29 07:28:18.572 186962 INFO nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Using config drive#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.050 186962 INFO nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Creating config drive at /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.config#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.055 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjuiiz80e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.177 186962 DEBUG oslo_concurrency.processutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjuiiz80e" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:19 np0005539505 kernel: tap9083d4b6-b3: entered promiscuous mode
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.2219] manager: (tap9083d4b6-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Nov 29 02:28:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:19Z|00600|binding|INFO|Claiming lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for this chassis.
Nov 29 02:28:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:19Z|00601|binding|INFO|9083d4b6-b3e2-451d-b686-a9cab7e5a2f5: Claiming fa:16:3e:19:93:e0 10.100.0.7
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.273 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.275 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.293 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:93:e0 10.100.0.7'], port_security=['fa:16:3e:19:93:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '97b39e54-312a-4ebc-863b-e1ef5f4cf363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '2', 'neutron:security_group_ids': '99684fb2-cb2f-45bc-99fa-10934e68636b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48e2bcd2-f544-4812-a73e-4d43e4ef323e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.294 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 in datapath 717f1c01-fb17-41f0-848c-cebdb3841bf9 bound to our chassis#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.296 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 717f1c01-fb17-41f0-848c-cebdb3841bf9#033[00m
Nov 29 02:28:19 np0005539505 systemd-machined[153285]: New machine qemu-69-instance-00000086.
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.306 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b32d7f0f-b0e9-4940-84b3-c3714e032dec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.307 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap717f1c01-f1 in ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.309 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap717f1c01-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.309 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e257c49-bec6-4dcc-b053-27db80c29e77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.310 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7e26cb61-0770-49de-8451-3da1ca69d75f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.322 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[143d3e56-17b1-4e62-9066-d8119f96ab22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 systemd[1]: Started Virtual Machine qemu-69-instance-00000086.
Nov 29 02:28:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:19Z|00602|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 ovn-installed in OVS
Nov 29 02:28:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:19Z|00603|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 up in Southbound
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.330 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:19 np0005539505 systemd-udevd[240604]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.348 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[815b3f86-a300-48f8-9a62-f4ff505a9aaa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.3569] device (tap9083d4b6-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.3579] device (tap9083d4b6-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.377 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3569b6f7-4cc5-4a24-9b26-a1244362e327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.383 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8833d90d-57f9-460d-924d-aa0bc256e67c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.3851] manager: (tap717f1c01-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/299)
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.412 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[81654857-30b9-4774-ac06-d47b81fd3b27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.415 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3650a2ae-6b5f-4a72-8e02-0588c23e735b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.4356] device (tap717f1c01-f0): carrier: link connected
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.440 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2649dfc0-aede-45ec-899f-ecb9f45cc1b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.455 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[da4449da-028b-4931-9f9a-0cf57db19f0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap717f1c01-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:5b:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674702, 'reachable_time': 39649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240636, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.469 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b01387ee-128b-4779-b242-a9a4c914faf7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:5bed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 674702, 'tstamp': 674702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240637, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.484 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d55011f2-9fa9-41e6-90fd-5b64097e6c6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap717f1c01-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:5b:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674702, 'reachable_time': 39649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240640, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.514 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c68c4e-f9f2-4849-b2f0-36fd677ea213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.568 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[155124b1-ee0d-4ac5-b581-4d876ba5d8bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.570 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap717f1c01-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.570 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.571 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap717f1c01-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:19 np0005539505 NetworkManager[55134]: <info>  [1764401299.5736] manager: (tap717f1c01-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Nov 29 02:28:19 np0005539505 kernel: tap717f1c01-f0: entered promiscuous mode
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.573 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401299.57172, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.573 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Started (Lifecycle Event)#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.576 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap717f1c01-f0, col_values=(('external_ids', {'iface-id': '84ef03fa-56c5-40a8-989e-6c02a5b46df5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:19Z|00604|binding|INFO|Releasing lport 84ef03fa-56c5-40a8-989e-6c02a5b46df5 from this chassis (sb_readonly=0)
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.578 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dbb9fe39-c6e9-41fc-bb04-38497b9335a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.580 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-717f1c01-fb17-41f0-848c-cebdb3841bf9
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 717f1c01-fb17-41f0-848c-cebdb3841bf9
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:28:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:19.580 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'env', 'PROCESS_TAG=haproxy-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/717f1c01-fb17-41f0-848c-cebdb3841bf9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.588 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.602 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.608 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401299.572952, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.609 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.631 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.635 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.660 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:19 np0005539505 podman[240677]: 2025-11-29 07:28:19.942977268 +0000 UTC m=+0.051259511 container create 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.964 186962 DEBUG nova.network.neutron [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updated VIF entry in instance network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.966 186962 DEBUG nova.network.neutron [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:19 np0005539505 systemd[1]: Started libpod-conmon-082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02.scope.
Nov 29 02:28:19 np0005539505 nova_compute[186958]: 2025-11-29 07:28:19.989 186962 DEBUG oslo_concurrency.lockutils [req-df8fb984-6df2-49d8-91cb-59013fa2ed23 req-0af2562f-19ab-447c-afdf-06afeace8631 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:20 np0005539505 podman[240677]: 2025-11-29 07:28:19.914620966 +0000 UTC m=+0.022903219 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:28:20 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:28:20 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c00cf8ba1b3bebf1e619726fe0040ad93dc7cf07c97a4efbb93fc0168bbceb3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:28:20 np0005539505 podman[240677]: 2025-11-29 07:28:20.031103463 +0000 UTC m=+0.139385716 container init 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:28:20 np0005539505 podman[240677]: 2025-11-29 07:28:20.036890566 +0000 UTC m=+0.145172799 container start 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:28:20 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [NOTICE]   (240696) : New worker (240698) forked
Nov 29 02:28:20 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [NOTICE]   (240696) : Loading success.
Nov 29 02:28:20 np0005539505 nova_compute[186958]: 2025-11-29 07:28:20.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.888 186962 DEBUG nova.compute.manager [req-a4df4ece-a29a-4065-8578-a04212f54090 req-a3380f52-ebc4-49ea-b055-6fac9d7f91be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.889 186962 DEBUG oslo_concurrency.lockutils [req-a4df4ece-a29a-4065-8578-a04212f54090 req-a3380f52-ebc4-49ea-b055-6fac9d7f91be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.889 186962 DEBUG oslo_concurrency.lockutils [req-a4df4ece-a29a-4065-8578-a04212f54090 req-a3380f52-ebc4-49ea-b055-6fac9d7f91be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.889 186962 DEBUG oslo_concurrency.lockutils [req-a4df4ece-a29a-4065-8578-a04212f54090 req-a3380f52-ebc4-49ea-b055-6fac9d7f91be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.889 186962 DEBUG nova.compute.manager [req-a4df4ece-a29a-4065-8578-a04212f54090 req-a3380f52-ebc4-49ea-b055-6fac9d7f91be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Processing event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.890 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.894 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401301.8940248, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.894 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.895 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.899 186962 INFO nova.virt.libvirt.driver [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance spawned successfully.#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.899 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.919 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.926 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.927 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.928 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.928 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.929 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.929 186962 DEBUG nova.virt.libvirt.driver [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.933 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:21 np0005539505 nova_compute[186958]: 2025-11-29 07:28:21.970 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:22 np0005539505 nova_compute[186958]: 2025-11-29 07:28:22.040 186962 INFO nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Took 9.71 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:28:22 np0005539505 nova_compute[186958]: 2025-11-29 07:28:22.040 186962 DEBUG nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:22 np0005539505 nova_compute[186958]: 2025-11-29 07:28:22.134 186962 INFO nova.compute.manager [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Took 11.70 seconds to build instance.#033[00m
Nov 29 02:28:22 np0005539505 nova_compute[186958]: 2025-11-29 07:28:22.210 186962 DEBUG oslo_concurrency.lockutils [None req-cce036ab-2828-4c21-bade-dadbbd7981d6 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:22 np0005539505 nova_compute[186958]: 2025-11-29 07:28:22.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:22 np0005539505 podman[240708]: 2025-11-29 07:28:22.730360257 +0000 UTC m=+0.061219793 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc.)
Nov 29 02:28:22 np0005539505 podman[240709]: 2025-11-29 07:28:22.742442229 +0000 UTC m=+0.067247844 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.428 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.429 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.429 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.429 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.897 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.966 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:23 np0005539505 nova_compute[186958]: 2025-11-29 07:28:23.968 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.010 186962 DEBUG nova.compute.manager [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.011 186962 DEBUG oslo_concurrency.lockutils [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.012 186962 DEBUG oslo_concurrency.lockutils [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.012 186962 DEBUG oslo_concurrency.lockutils [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.013 186962 DEBUG nova.compute.manager [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.013 186962 WARNING nova.compute.manager [req-1ee29e14-bdfd-4cc4-a8d4-7cc12c781bfd req-e583689d-c415-4fd2-9a50-6bb2d5c629ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.028 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.034 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.057 186962 DEBUG oslo_concurrency.lockutils [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.058 186962 DEBUG oslo_concurrency.lockutils [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.059 186962 DEBUG nova.compute.manager [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.063 186962 DEBUG nova.compute.manager [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.064 186962 DEBUG nova.objects.instance [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'flavor' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.093 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.095 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.119 186962 DEBUG nova.objects.instance [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'info_cache' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.168 186962 DEBUG nova.virt.libvirt.driver [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.174 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.351 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.352 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5463MB free_disk=73.04421997070312GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.353 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.353 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.576 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 19c7d8b2-3f1a-40cf-a538-dd2752970ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.578 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.579 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.579 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.629 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.645 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.667 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:28:24 np0005539505 nova_compute[186958]: 2025-11-29 07:28:24.668 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:24 np0005539505 podman[240765]: 2025-11-29 07:28:24.750346497 +0000 UTC m=+0.078564875 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.667 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.667 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.668 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.675 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.687 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.687 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.688 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:28:25 np0005539505 nova_compute[186958]: 2025-11-29 07:28:25.688 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:26 np0005539505 kernel: tap1e2538b2-92 (unregistering): left promiscuous mode
Nov 29 02:28:26 np0005539505 NetworkManager[55134]: <info>  [1764401306.3687] device (tap1e2538b2-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00605|binding|INFO|Releasing lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 from this chassis (sb_readonly=0)
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00606|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 down in Southbound
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00607|binding|INFO|Removing iface tap1e2538b2-92 ovn-installed in OVS
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.378 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.384 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.386 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.388 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.390 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.391 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[50eee206-2e77-4973-a5a3-9cda2111b698]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.391 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace which is not needed anymore#033[00m
Nov 29 02:28:26 np0005539505 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000083.scope: Deactivated successfully.
Nov 29 02:28:26 np0005539505 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d00000083.scope: Consumed 12.949s CPU time.
Nov 29 02:28:26 np0005539505 systemd-machined[153285]: Machine qemu-68-instance-00000083 terminated.
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [NOTICE]   (240456) : haproxy version is 2.8.14-c23fe91
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [NOTICE]   (240456) : path to executable is /usr/sbin/haproxy
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [WARNING]  (240456) : Exiting Master process...
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [WARNING]  (240456) : Exiting Master process...
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [ALERT]    (240456) : Current worker (240458) exited with code 143 (Terminated)
Nov 29 02:28:26 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[240452]: [WARNING]  (240456) : All workers exited. Exiting... (0)
Nov 29 02:28:26 np0005539505 systemd[1]: libpod-f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4.scope: Deactivated successfully.
Nov 29 02:28:26 np0005539505 podman[240807]: 2025-11-29 07:28:26.520619769 +0000 UTC m=+0.041171767 container died f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:28:26 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4-userdata-shm.mount: Deactivated successfully.
Nov 29 02:28:26 np0005539505 systemd[1]: var-lib-containers-storage-overlay-d62a61b36e428f8380b0faab1c1d90b8aff2bea52253ee73e0fbbe284e37e861-merged.mount: Deactivated successfully.
Nov 29 02:28:26 np0005539505 podman[240807]: 2025-11-29 07:28:26.556737911 +0000 UTC m=+0.077289909 container cleanup f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:28:26 np0005539505 systemd[1]: libpod-conmon-f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4.scope: Deactivated successfully.
Nov 29 02:28:26 np0005539505 kernel: tap1e2538b2-92: entered promiscuous mode
Nov 29 02:28:26 np0005539505 NetworkManager[55134]: <info>  [1764401306.5949] manager: (tap1e2538b2-92): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.595 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00608|binding|INFO|Claiming lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 for this chassis.
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00609|binding|INFO|1e2538b2-9233-45f7-9334-e7fcdba1da31: Claiming fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:28:26 np0005539505 kernel: tap1e2538b2-92 (unregistering): left promiscuous mode
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.604 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00610|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 ovn-installed in OVS
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.616 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00611|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 up in Southbound
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00612|binding|INFO|Releasing lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 from this chassis (sb_readonly=1)
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00613|binding|INFO|Removing iface tap1e2538b2-92 ovn-installed in OVS
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00614|if_status|INFO|Dropped 6 log messages in last 676 seconds (most recently, 676 seconds ago) due to excessive rate
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00615|if_status|INFO|Not setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 down as sb is readonly
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 podman[240840]: 2025-11-29 07:28:26.620856156 +0000 UTC m=+0.046240920 container remove f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00616|binding|INFO|Releasing lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 from this chassis (sb_readonly=0)
Nov 29 02:28:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:26Z|00617|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 down in Southbound
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.630 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.634 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d48ed793-5440-4faf-bb81-312745618ad1]: (4, ('Sat Nov 29 07:28:26 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4)\nf8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4\nSat Nov 29 07:28:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (f8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4)\nf8f663a6c501b2b2454e4905a5d191be06a1822b742e47921a1de5f98af8f1c4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.635 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[396f4556-0084-4f4f-8094-cb1254fda34a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.637 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 kernel: tap28412826-50: left promiscuous mode
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.651 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.654 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[39d7766c-5003-4dd7-ac78-cf4cebcee1c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.668 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3f854a-c9c7-4d97-988d-6bade37dbc43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.669 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a13eeb1-0094-4850-be75-83600a90413c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.686 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4a811023-21cd-4137-9bc4-6677d51e5928]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 672513, 'reachable_time': 15036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240868, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 systemd[1]: run-netns-ovnmeta\x2d28412826\x2d5463\x2d46e4\x2d95cb\x2da7d788b1ab15.mount: Deactivated successfully.
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.688 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.688 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[0016e36a-c285-4ff7-85bb-4f0d28103c9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.691 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.693 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.694 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[35b89334-5556-484f-89b4-a374ab91753a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.694 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.696 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:26.696 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21dc0636-561f-4e18-9c3b-a13d9752cb65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:26 np0005539505 nova_compute[186958]: 2025-11-29 07:28:26.993 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updating instance_info_cache with network_info: [{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.017 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.017 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.473 186962 INFO nova.virt.libvirt.driver [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.479 186962 INFO nova.virt.libvirt.driver [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance destroyed successfully.#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.479 186962 DEBUG nova.objects.instance [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'numa_topology' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:27 np0005539505 NetworkManager[55134]: <info>  [1764401307.4838] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Nov 29 02:28:27 np0005539505 NetworkManager[55134]: <info>  [1764401307.4844] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.497 186962 DEBUG nova.compute.manager [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:27.507 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:27.508 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:27.509 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.620 186962 DEBUG oslo_concurrency.lockutils [None req-da5cb54a-8d5d-4152-b979-30840abe45ca 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.640 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:27Z|00618|binding|INFO|Releasing lport 84ef03fa-56c5-40a8-989e-6c02a5b46df5 from this chassis (sb_readonly=0)
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.663 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.721 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.722 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.723 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:27 np0005539505 nova_compute[186958]: 2025-11-29 07:28:27.745 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.004 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.005 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.012 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.012 186962 INFO nova.compute.claims [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.151 186962 DEBUG nova.compute.provider_tree [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.171 186962 DEBUG nova.scheduler.client.report [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.195 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.196 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.266 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.267 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.285 186962 INFO nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.303 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.428 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.429 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.430 186962 INFO nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Creating image(s)#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.431 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.431 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.433 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.450 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.524 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.526 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.526 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.543 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.610 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.611 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.647 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.648 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.649 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.689 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'flavor' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.702 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.703 186962 DEBUG nova.virt.disk.api [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Checking if we can resize image /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.704 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.727 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'info_cache' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.757 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.758 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquired lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.758 186962 DEBUG nova.network.neutron [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.760 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.760 186962 DEBUG nova.virt.disk.api [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Cannot resize image /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.760 186962 DEBUG nova.objects.instance [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'migration_context' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.786 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.787 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Ensure instance console log exists: /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.787 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.788 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.788 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:28 np0005539505 nova_compute[186958]: 2025-11-29 07:28:28.822 186962 DEBUG nova.policy [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.082 186962 DEBUG nova.compute.manager [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.083 186962 DEBUG oslo_concurrency.lockutils [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.083 186962 DEBUG oslo_concurrency.lockutils [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.083 186962 DEBUG oslo_concurrency.lockutils [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.083 186962 DEBUG nova.compute.manager [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.084 186962 WARNING nova.compute.manager [req-dc62f5b1-9746-42a6-9f0c-ff4356c0ed39 req-6955ab52-df1b-480d-9b74-0fe710a702b7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.452 186962 DEBUG nova.compute.manager [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.452 186962 DEBUG nova.compute.manager [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing instance network info cache due to event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.452 186962 DEBUG oslo_concurrency.lockutils [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.453 186962 DEBUG oslo_concurrency.lockutils [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:29 np0005539505 nova_compute[186958]: 2025-11-29 07:28:29.453 186962 DEBUG nova.network.neutron [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.495 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Successfully created port: 0273a9e9-32f2-4363-a3b4-aa1a87caf07c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.727 186962 DEBUG nova.network.neutron [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updating instance_info_cache with network_info: [{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.755 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Releasing lock "refresh_cache-19c7d8b2-3f1a-40cf-a538-dd2752970ffb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.778 186962 INFO nova.virt.libvirt.driver [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance destroyed successfully.#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.779 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'numa_topology' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.795 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'resources' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.819 186962 DEBUG nova.virt.libvirt.vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:27:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:27Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.820 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.821 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.821 186962 DEBUG os_vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.824 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e2538b2-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.826 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.829 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.831 186962 INFO os_vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92')#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.838 186962 DEBUG nova.virt.libvirt.driver [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start _get_guest_xml network_info=[{"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.842 186962 WARNING nova.virt.libvirt.driver [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.846 186962 DEBUG nova.virt.libvirt.host [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.847 186962 DEBUG nova.virt.libvirt.host [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.850 186962 DEBUG nova.virt.libvirt.host [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.851 186962 DEBUG nova.virt.libvirt.host [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.852 186962 DEBUG nova.virt.libvirt.driver [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.852 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.852 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.853 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.853 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.853 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.853 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.854 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.854 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.854 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.855 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.855 186962 DEBUG nova.virt.hardware [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.855 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'vcpu_model' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.874 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.929 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.930 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.930 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.931 186962 DEBUG oslo_concurrency.lockutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.932 186962 DEBUG nova.virt.libvirt.vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:27:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:27Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.932 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.933 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.934 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.950 186962 DEBUG nova.virt.libvirt.driver [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <uuid>19c7d8b2-3f1a-40cf-a538-dd2752970ffb</uuid>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <name>instance-00000083</name>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1624881395</nova:name>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:28:30</nova:creationTime>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:user uuid="3e2a40601ced4de78fe1767769f262c0">tempest-ListServerFiltersTestJSON-1571311845-project-member</nova:user>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:project uuid="7843cfa993a1428aaaa660321ebba1ac">tempest-ListServerFiltersTestJSON-1571311845</nova:project>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        <nova:port uuid="1e2538b2-9233-45f7-9334-e7fcdba1da31">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="serial">19c7d8b2-3f1a-40cf-a538-dd2752970ffb</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="uuid">19c7d8b2-3f1a-40cf-a538-dd2752970ffb</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.config"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:a4:02:7c"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <target dev="tap1e2538b2-92"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/console.log" append="off"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <input type="keyboard" bus="usb"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:28:30 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:28:30 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:28:30 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:28:30 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:28:30 np0005539505 nova_compute[186958]: 2025-11-29 07:28:30.957 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.026 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.041 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.092 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.093 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'trusted_certs' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.241 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.297 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.299 186962 DEBUG nova.virt.disk.api [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Checking if we can resize image /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.300 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.367 186962 DEBUG oslo_concurrency.processutils [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.368 186962 DEBUG nova.virt.disk.api [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Cannot resize image /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.369 186962 DEBUG nova.objects.instance [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'migration_context' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.372 186962 DEBUG nova.network.neutron [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updated VIF entry in instance network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.373 186962 DEBUG nova.network.neutron [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.396 186962 DEBUG nova.virt.libvirt.vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:27:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:27Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.397 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.398 186962 DEBUG nova.network.os_vif_util [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.398 186962 DEBUG os_vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.399 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.400 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.400 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.404 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.405 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.405 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.406 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.406 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.406 186962 WARNING nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.407 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.407 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.407 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.408 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.408 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.408 186962 WARNING nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.409 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.409 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.409 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.410 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.410 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.410 186962 WARNING nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.410 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.411 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.411 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.411 186962 DEBUG oslo_concurrency.lockutils [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.411 186962 DEBUG nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.412 186962 WARNING nova.compute.manager [req-d1284ca4-c8d8-4adc-81e1-a9a778b93122 req-a5249d04-acf2-4708-8b08-bced7e7ca363 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state stopped and task_state powering-on.#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.413 186962 DEBUG oslo_concurrency.lockutils [req-a8559951-424e-4e54-adf2-8ddc9d722aa7 req-a4799480-b870-43f8-bbbd-337193a8e53d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.414 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.414 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e2538b2-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.415 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e2538b2-92, col_values=(('external_ids', {'iface-id': '1e2538b2-9233-45f7-9334-e7fcdba1da31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:02:7c', 'vm-uuid': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.416 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.4177] manager: (tap1e2538b2-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.419 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.422 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.425 186962 INFO os_vif [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92')#033[00m
Nov 29 02:28:31 np0005539505 kernel: tap1e2538b2-92: entered promiscuous mode
Nov 29 02:28:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:31Z|00619|binding|INFO|Claiming lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 for this chassis.
Nov 29 02:28:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:31Z|00620|binding|INFO|1e2538b2-9233-45f7-9334-e7fcdba1da31: Claiming fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.5131] manager: (tap1e2538b2-92): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.512 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.517 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.518 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 bound to our chassis#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.520 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28412826-5463-46e4-95cb-a7d788b1ab15#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.532 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ce35242-4640-4ae6-8baa-ee781b4e7cbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.533 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28412826-51 in ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.536 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28412826-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[75a0bc7e-280d-4dd3-aa9a-ce2421206029]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 systemd-udevd[240915]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[72337e34-d034-4d31-a04c-3fedc7e3231d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:31Z|00621|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 ovn-installed in OVS
Nov 29 02:28:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:31Z|00622|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 up in Southbound
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.550 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.550 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[eb57200f-7d6d-470c-882a-07a4450376ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.5566] device (tap1e2538b2-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:28:31 np0005539505 systemd-machined[153285]: New machine qemu-70-instance-00000083.
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.5623] device (tap1e2538b2-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.562 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[996ae569-39f3-45d1-be44-5f7a939bf167]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 systemd[1]: Started Virtual Machine qemu-70-instance-00000083.
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.595 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e99f5c80-34a7-4520-b748-ca2c362480b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.6019] manager: (tap28412826-50): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Nov 29 02:28:31 np0005539505 systemd-udevd[240922]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.601 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b9682bb7-7b39-4325-bf5f-1a5680c09dcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.634 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a716ce9e-2c51-4ded-86fa-b3c778f09c4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.637 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c83737e8-e3da-4199-bd03-2dee90589d11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.6554] device (tap28412826-50): carrier: link connected
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.659 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[338a80a1-2459-48ab-821a-7696ceff7476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.674 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ddeecb95-d3b3-4cdb-9e67-dcaeb05f5752]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675923, 'reachable_time': 37132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240950, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.692 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[500f7514-3271-421c-afe9-b683fb0fe05d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c072'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 675923, 'tstamp': 675923}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240951, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.707 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f06fbc1d-0deb-4ecd-8cef-799573949ae1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28412826-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c0:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675923, 'reachable_time': 37132, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240952, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.738 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cea5ddd0-153a-4927-aac2-a76cecacc878]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.799 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[197503cd-6fbc-49f1-8d96-43ec795eb02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.801 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.801 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.802 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28412826-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 NetworkManager[55134]: <info>  [1764401311.8045] manager: (tap28412826-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Nov 29 02:28:31 np0005539505 kernel: tap28412826-50: entered promiscuous mode
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.803 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.805 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.807 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28412826-50, col_values=(('external_ids', {'iface-id': '2abf732f-8f8c-470e-b6e2-def265b14d70'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:31Z|00623|binding|INFO|Releasing lport 2abf732f-8f8c-470e-b6e2-def265b14d70 from this chassis (sb_readonly=0)
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.825 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.826 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6003f0-bec3-4ffc-8711-1b5cb16cf6b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.827 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/28412826-5463-46e4-95cb-a7d788b1ab15.pid.haproxy
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 28412826-5463-46e4-95cb-a7d788b1ab15
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:28:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:31.827 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'env', 'PROCESS_TAG=haproxy-28412826-5463-46e4-95cb-a7d788b1ab15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28412826-5463-46e4-95cb-a7d788b1ab15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.907 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 19c7d8b2-3f1a-40cf-a538-dd2752970ffb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.908 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401311.906031, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.908 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.910 186962 DEBUG nova.compute.manager [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.916 186962 INFO nova.virt.libvirt.driver [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance rebooted successfully.#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.917 186962 DEBUG nova.compute.manager [None req-d070e8e8-fc23-492a-b77e-ca7e5bcf9a67 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.950 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:31 np0005539505 nova_compute[186958]: 2025-11-29 07:28:31.953 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.008 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401311.9072266, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.009 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Started (Lifecycle Event)#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.030 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.036 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:32 np0005539505 podman[240991]: 2025-11-29 07:28:32.192130675 +0000 UTC m=+0.068128700 container create e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:28:32 np0005539505 systemd[1]: Started libpod-conmon-e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9.scope.
Nov 29 02:28:32 np0005539505 podman[240991]: 2025-11-29 07:28:32.153932453 +0000 UTC m=+0.029930498 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:28:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:28:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a634b3bce9b25e2f7774bd2016dfffbabae8e78ae8229234d1ae66b2dd2a682d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.274 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Successfully updated port: 0273a9e9-32f2-4363-a3b4-aa1a87caf07c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:28:32 np0005539505 podman[240991]: 2025-11-29 07:28:32.280390132 +0000 UTC m=+0.156388157 container init e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:28:32 np0005539505 podman[240991]: 2025-11-29 07:28:32.287908025 +0000 UTC m=+0.163906040 container start e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.292 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.293 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquired lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.293 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:32 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [NOTICE]   (241010) : New worker (241012) forked
Nov 29 02:28:32 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [NOTICE]   (241010) : Loading success.
Nov 29 02:28:32 np0005539505 nova_compute[186958]: 2025-11-29 07:28:32.527 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.507 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.508 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.509 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.509 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.510 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.510 186962 WARNING nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.511 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-changed-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.512 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Refreshing instance network info cache due to event network-changed-0273a9e9-32f2-4363-a3b4-aa1a87caf07c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.512 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.766 186962 DEBUG nova.network.neutron [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Updating instance_info_cache with network_info: [{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.828 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Releasing lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.829 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance network_info: |[{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.829 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.830 186962 DEBUG nova.network.neutron [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Refreshing network info cache for port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.833 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start _get_guest_xml network_info=[{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.837 186962 WARNING nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.842 186962 DEBUG nova.virt.libvirt.host [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.843 186962 DEBUG nova.virt.libvirt.host [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.847 186962 DEBUG nova.virt.libvirt.host [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.848 186962 DEBUG nova.virt.libvirt.host [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.849 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.850 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.850 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.851 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.851 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.851 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.851 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.852 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.852 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.852 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.853 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.853 186962 DEBUG nova.virt.hardware [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.856 186962 DEBUG nova.virt.libvirt.vif [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-400688363',display_name='tempest-ServerRescueNegativeTestJSON-server-400688363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-400688363',id=137,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-1c7hw7ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:28Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=5487c798-eb5a-4186-9693-a64ecd64b296,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.857 186962 DEBUG nova.network.os_vif_util [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.857 186962 DEBUG nova.network.os_vif_util [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.859 186962 DEBUG nova.objects.instance [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.880 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <uuid>5487c798-eb5a-4186-9693-a64ecd64b296</uuid>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <name>instance-00000089</name>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-400688363</nova:name>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:28:33</nova:creationTime>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:user uuid="4863fb992d4c48de9a92f63ffb1174a8">tempest-ServerRescueNegativeTestJSON-1892401049-project-member</nova:user>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:project uuid="5d1e4f74add34e9b9a2084bd9586db0c">tempest-ServerRescueNegativeTestJSON-1892401049</nova:project>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        <nova:port uuid="0273a9e9-32f2-4363-a3b4-aa1a87caf07c">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="serial">5487c798-eb5a-4186-9693-a64ecd64b296</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="uuid">5487c798-eb5a-4186-9693-a64ecd64b296</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:7d:ab:ce"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <target dev="tap0273a9e9-32"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/console.log" append="off"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:28:33 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:28:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:28:33 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:28:33 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.886 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Preparing to wait for external event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.886 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.886 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.887 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.888 186962 DEBUG nova.virt.libvirt.vif [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-400688363',display_name='tempest-ServerRescueNegativeTestJSON-server-400688363',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-400688363',id=137,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-1c7hw7ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:28Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=5487c798-eb5a-4186-9693-a64ecd64b296,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.888 186962 DEBUG nova.network.os_vif_util [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.889 186962 DEBUG nova.network.os_vif_util [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.889 186962 DEBUG os_vif [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.890 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.891 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.894 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0273a9e9-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.894 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0273a9e9-32, col_values=(('external_ids', {'iface-id': '0273a9e9-32f2-4363-a3b4-aa1a87caf07c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:ab:ce', 'vm-uuid': '5487c798-eb5a-4186-9693-a64ecd64b296'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.896 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:33 np0005539505 NetworkManager[55134]: <info>  [1764401313.8969] manager: (tap0273a9e9-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.902 186962 INFO os_vif [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32')#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.952 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.953 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.954 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No VIF found with MAC fa:16:3e:7d:ab:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:28:33 np0005539505 nova_compute[186958]: 2025-11-29 07:28:33.954 186962 INFO nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Using config drive#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.274 186962 INFO nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Creating config drive at /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.278 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz_t4_z0c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.403 186962 DEBUG oslo_concurrency.processutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz_t4_z0c" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:34 np0005539505 kernel: tap0273a9e9-32: entered promiscuous mode
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.4540] manager: (tap0273a9e9-32): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00624|binding|INFO|Claiming lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c for this chassis.
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.460 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00625|binding|INFO|0273a9e9-32f2-4363-a3b4-aa1a87caf07c: Claiming fa:16:3e:7d:ab:ce 10.100.0.8
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.4663] device (tap0273a9e9-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.4675] device (tap0273a9e9-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.471 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:ab:ce 10.100.0.8'], port_security=['fa:16:3e:7d:ab:ce 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0273a9e9-32f2-4363-a3b4-aa1a87caf07c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.472 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d bound to our chassis#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.474 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 008329a1-d4dc-4cfb-be68-95f658d9813d#033[00m
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00626|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c ovn-installed in OVS
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00627|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c up in Southbound
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:93:e0 10.100.0.7
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:93:e0 10.100.0.7
Nov 29 02:28:34 np0005539505 systemd-machined[153285]: New machine qemu-71-instance-00000089.
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.491 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5d889565-f702-45ef-abe5-b9d6ad0dd42e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.492 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap008329a1-d1 in ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.494 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap008329a1-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.494 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99b5c10c-27a5-46ec-be7b-eec7c1a14a4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.495 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[959189a1-a32d-4c56-8b1b-aacc3718c562]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 systemd[1]: Started Virtual Machine qemu-71-instance-00000089.
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.504 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[47895224-584b-4b50-8f8a-3eedffff7fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.518 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ed45b917-2d7a-4bdd-8bdb-d54aa8ebed09]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.560 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[702fbec7-3c5e-481c-b63f-4e157ab39b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.5730] manager: (tap008329a1-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/310)
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.571 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[50fef59c-9462-4c27-865f-7db86d5b8921]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 systemd-udevd[241065]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.606 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[17f052ac-0dd5-4045-ae6d-5ef7867c2791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.610 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ef83300e-70c3-465d-aece-9044bf541e6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.6364] device (tap008329a1-d0): carrier: link connected
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.644 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7729a264-55a1-4044-b3a6-22e637aa10ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.663 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3cbe23-26ce-4950-8de5-30581d32fce8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676222, 'reachable_time': 38557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241085, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.681 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[227da4c0-43b8-46e0-830c-3dfa4b5baf92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:da9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 676222, 'tstamp': 676222}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241086, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.697 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5466b38c-6c1e-4ed6-bc29-e34d5694b56a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 198], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676222, 'reachable_time': 38557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241089, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.733 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f6249390-0525-4eb6-b993-d24ec7638b9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.792 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401314.7914317, 5487c798-eb5a-4186-9693-a64ecd64b296 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.793 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Started (Lifecycle Event)#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.792 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e180e7f5-eace-426a-93c6-acf3cba3ee90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.795 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.795 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.795 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap008329a1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:34 np0005539505 NetworkManager[55134]: <info>  [1764401314.7980] manager: (tap008329a1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 kernel: tap008329a1-d0: entered promiscuous mode
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.801 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap008329a1-d0, col_values=(('external_ids', {'iface-id': 'd36011d9-2f3d-4616-b3ba-40f6405df460'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.801 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:34Z|00628|binding|INFO|Releasing lport d36011d9-2f3d-4616-b3ba-40f6405df460 from this chassis (sb_readonly=0)
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.811 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.814 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.815 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[061c798e-a5a1-4f77-8d2a-2ce69b8bff90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.816 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:28:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:34.816 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'env', 'PROCESS_TAG=haproxy-008329a1-d4dc-4cfb-be68-95f658d9813d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/008329a1-d4dc-4cfb-be68-95f658d9813d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.818 186962 DEBUG nova.compute.manager [req-e5205236-a494-40fb-945e-0feda892bb2c req-90924c8c-fc83-441b-8ed0-a8dc13a696c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.818 186962 DEBUG oslo_concurrency.lockutils [req-e5205236-a494-40fb-945e-0feda892bb2c req-90924c8c-fc83-441b-8ed0-a8dc13a696c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.819 186962 DEBUG oslo_concurrency.lockutils [req-e5205236-a494-40fb-945e-0feda892bb2c req-90924c8c-fc83-441b-8ed0-a8dc13a696c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.819 186962 DEBUG oslo_concurrency.lockutils [req-e5205236-a494-40fb-945e-0feda892bb2c req-90924c8c-fc83-441b-8ed0-a8dc13a696c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.819 186962 DEBUG nova.compute.manager [req-e5205236-a494-40fb-945e-0feda892bb2c req-90924c8c-fc83-441b-8ed0-a8dc13a696c9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Processing event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.820 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.825 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401314.791695, 5487c798-eb5a-4186-9693-a64ecd64b296 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.825 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.827 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.830 186962 INFO nova.virt.libvirt.driver [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance spawned successfully.#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.830 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.846 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.852 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401314.8238974, 5487c798-eb5a-4186-9693-a64ecd64b296 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.852 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.856 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.856 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.857 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.857 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.858 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.858 186962 DEBUG nova.virt.libvirt.driver [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.888 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.893 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.917 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.940 186962 INFO nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Took 6.51 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:28:34 np0005539505 nova_compute[186958]: 2025-11-29 07:28:34.941 186962 DEBUG nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.073 186962 INFO nova.compute.manager [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Took 7.15 seconds to build instance.#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.102 186962 DEBUG oslo_concurrency.lockutils [None req-eec97d2c-cc81-4aac-ae27-ffcb46135d25 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.380s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.107 186962 DEBUG nova.network.neutron [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Updated VIF entry in instance network info cache for port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.107 186962 DEBUG nova.network.neutron [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Updating instance_info_cache with network_info: [{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.127 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.128 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.128 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.128 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.129 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.129 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.129 186962 WARNING nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.129 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.129 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.130 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.130 186962 DEBUG oslo_concurrency.lockutils [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.130 186962 DEBUG nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.130 186962 WARNING nova.compute.manager [req-99f7ab0f-a51c-4a63-9248-69620f55847e req-564aa182-2f48-43f4-87a7-b3a709679980 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:35 np0005539505 podman[241126]: 2025-11-29 07:28:35.181292024 +0000 UTC m=+0.059008571 container create 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:28:35 np0005539505 systemd[1]: Started libpod-conmon-00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7.scope.
Nov 29 02:28:35 np0005539505 podman[241126]: 2025-11-29 07:28:35.158107358 +0000 UTC m=+0.035823925 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:28:35 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:28:35 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca9b21ffcc47252e9e09ad7acbe73bb4b9630206e844c6bda2dcfc95eea105cd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:28:35 np0005539505 podman[241126]: 2025-11-29 07:28:35.286475211 +0000 UTC m=+0.164191778 container init 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:28:35 np0005539505 podman[241126]: 2025-11-29 07:28:35.290932167 +0000 UTC m=+0.168648724 container start 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:28:35 np0005539505 podman[241136]: 2025-11-29 07:28:35.359453525 +0000 UTC m=+0.146358722 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:28:35 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [NOTICE]   (241183) : New worker (241190) forked
Nov 29 02:28:35 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [NOTICE]   (241183) : Loading success.
Nov 29 02:28:35 np0005539505 podman[241137]: 2025-11-29 07:28:35.391624537 +0000 UTC m=+0.176061634 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:28:35 np0005539505 nova_compute[186958]: 2025-11-29 07:28:35.680 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.009 186962 DEBUG nova.compute.manager [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.010 186962 DEBUG oslo_concurrency.lockutils [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.011 186962 DEBUG oslo_concurrency.lockutils [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.011 186962 DEBUG oslo_concurrency.lockutils [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.012 186962 DEBUG nova.compute.manager [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.015 186962 WARNING nova.compute.manager [req-050dc3b7-7ded-4bd8-be67-c174c7d9ddd1 req-0f5d7572-56c5-40ec-b1b2-6705407ad521 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state active and task_state None.#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.313 186962 INFO nova.compute.manager [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Rescuing#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.314 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.314 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquired lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:28:37 np0005539505 nova_compute[186958]: 2025-11-29 07:28:37.314 186962 DEBUG nova.network.neutron [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:28:38 np0005539505 nova_compute[186958]: 2025-11-29 07:28:38.432 186962 DEBUG nova.network.neutron [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Updating instance_info_cache with network_info: [{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:28:38 np0005539505 podman[241201]: 2025-11-29 07:28:38.728989751 +0000 UTC m=+0.056455859 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:28:38 np0005539505 podman[241200]: 2025-11-29 07:28:38.738711176 +0000 UTC m=+0.065793073 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Nov 29 02:28:38 np0005539505 nova_compute[186958]: 2025-11-29 07:28:38.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:39 np0005539505 nova_compute[186958]: 2025-11-29 07:28:39.327 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Releasing lock "refresh_cache-5487c798-eb5a-4186-9693-a64ecd64b296" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:28:40 np0005539505 nova_compute[186958]: 2025-11-29 07:28:40.682 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:40 np0005539505 nova_compute[186958]: 2025-11-29 07:28:40.886 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:28:41 np0005539505 nova_compute[186958]: 2025-11-29 07:28:41.424 186962 INFO nova.compute.manager [None req-522aa00a-f06c-4f5b-9ddd-009653682c7b bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Get console output#033[00m
Nov 29 02:28:41 np0005539505 nova_compute[186958]: 2025-11-29 07:28:41.430 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:28:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:43.851 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:43.853 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:28:43 np0005539505 nova_compute[186958]: 2025-11-29 07:28:43.852 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:43 np0005539505 nova_compute[186958]: 2025-11-29 07:28:43.898 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:43Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:02:7c 10.100.0.14
Nov 29 02:28:43 np0005539505 nova_compute[186958]: 2025-11-29 07:28:43.934 186962 DEBUG nova.objects.instance [None req-865f1f23-0fc8-4094-a168-6c2b1f100132 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:44 np0005539505 nova_compute[186958]: 2025-11-29 07:28:44.707 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401324.7077408, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:28:44 np0005539505 nova_compute[186958]: 2025-11-29 07:28:44.708 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:28:45 np0005539505 nova_compute[186958]: 2025-11-29 07:28:45.456 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:45 np0005539505 nova_compute[186958]: 2025-11-29 07:28:45.460 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:28:45 np0005539505 nova_compute[186958]: 2025-11-29 07:28:45.684 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:45 np0005539505 nova_compute[186958]: 2025-11-29 07:28:45.723 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 29 02:28:46 np0005539505 kernel: tap9083d4b6-b3 (unregistering): left promiscuous mode
Nov 29 02:28:46 np0005539505 NetworkManager[55134]: <info>  [1764401326.3174] device (tap9083d4b6-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:28:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:46Z|00629|binding|INFO|Releasing lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 from this chassis (sb_readonly=0)
Nov 29 02:28:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:46Z|00630|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 down in Southbound
Nov 29 02:28:46 np0005539505 nova_compute[186958]: 2025-11-29 07:28:46.326 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:46Z|00631|binding|INFO|Removing iface tap9083d4b6-b3 ovn-installed in OVS
Nov 29 02:28:46 np0005539505 nova_compute[186958]: 2025-11-29 07:28:46.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:46 np0005539505 nova_compute[186958]: 2025-11-29 07:28:46.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:46 np0005539505 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000086.scope: Deactivated successfully.
Nov 29 02:28:46 np0005539505 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000086.scope: Consumed 13.406s CPU time.
Nov 29 02:28:46 np0005539505 systemd-machined[153285]: Machine qemu-69-instance-00000086 terminated.
Nov 29 02:28:46 np0005539505 nova_compute[186958]: 2025-11-29 07:28:46.552 186962 DEBUG nova.compute.manager [None req-865f1f23-0fc8-4094-a168-6c2b1f100132 bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:28:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:46.766 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:93:e0 10.100.0.7'], port_security=['fa:16:3e:19:93:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '97b39e54-312a-4ebc-863b-e1ef5f4cf363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '4', 'neutron:security_group_ids': '99684fb2-cb2f-45bc-99fa-10934e68636b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48e2bcd2-f544-4812-a73e-4d43e4ef323e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:46.768 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 in datapath 717f1c01-fb17-41f0-848c-cebdb3841bf9 unbound from our chassis#033[00m
Nov 29 02:28:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:46.771 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 717f1c01-fb17-41f0-848c-cebdb3841bf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:46.773 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd7a24ea-0719-4408-b618-d0705d5537c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:46.774 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 namespace which is not needed anymore#033[00m
Nov 29 02:28:46 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [NOTICE]   (240696) : haproxy version is 2.8.14-c23fe91
Nov 29 02:28:46 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [NOTICE]   (240696) : path to executable is /usr/sbin/haproxy
Nov 29 02:28:46 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [WARNING]  (240696) : Exiting Master process...
Nov 29 02:28:46 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [ALERT]    (240696) : Current worker (240698) exited with code 143 (Terminated)
Nov 29 02:28:46 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[240692]: [WARNING]  (240696) : All workers exited. Exiting... (0)
Nov 29 02:28:46 np0005539505 systemd[1]: libpod-082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02.scope: Deactivated successfully.
Nov 29 02:28:46 np0005539505 podman[241295]: 2025-11-29 07:28:46.969605678 +0000 UTC m=+0.111605330 container died 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:28:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02-userdata-shm.mount: Deactivated successfully.
Nov 29 02:28:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c00cf8ba1b3bebf1e619726fe0040ad93dc7cf07c97a4efbb93fc0168bbceb3f-merged.mount: Deactivated successfully.
Nov 29 02:28:47 np0005539505 podman[241295]: 2025-11-29 07:28:47.113601784 +0000 UTC m=+0.255601436 container cleanup 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:28:47 np0005539505 systemd[1]: libpod-conmon-082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02.scope: Deactivated successfully.
Nov 29 02:28:47 np0005539505 podman[241342]: 2025-11-29 07:28:47.175919457 +0000 UTC m=+0.041496695 container remove 082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.180 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0c7b861c-6d05-49b6-a878-2e28ce8abc61]: (4, ('Sat Nov 29 07:28:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 (082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02)\n082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02\nSat Nov 29 07:28:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 (082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02)\n082133769c36e20c38065200bf2a23b0e1834adad27d0f365f8ced74688e7d02\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.182 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[225a0909-0027-4548-9ee2-c9b5166d3247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.183 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap717f1c01-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:47 np0005539505 nova_compute[186958]: 2025-11-29 07:28:47.185 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:47 np0005539505 kernel: tap717f1c01-f0: left promiscuous mode
Nov 29 02:28:47 np0005539505 nova_compute[186958]: 2025-11-29 07:28:47.201 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.205 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c4e434-240c-4e76-8111-96a4ba3320c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.219 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[30db52f1-158d-448d-bbf5-f98ac87f0666]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.221 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82d30c1f-16c4-4131-82b5-304632c8da2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.242 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8acc8263-7595-40e3-9293-f0c729f42f90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 674695, 'reachable_time': 42225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241361, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:47 np0005539505 systemd[1]: run-netns-ovnmeta\x2d717f1c01\x2dfb17\x2d41f0\x2d848c\x2dcebdb3841bf9.mount: Deactivated successfully.
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.247 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:28:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:47.247 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d98a09b1-c3c3-4e58-973e-50332fb9bde8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.110 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '97b39e54-312a-4ebc-863b-e1ef5f4cf363', 'name': 'tempest-TestNetworkAdvancedServerOps-server-468452072', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000086', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'c231e63624d44fc19e0989abfb1afb22', 'user_id': 'bfd2024670594b10941cec8a59d2573f', 'hostId': 'be6ac3c60c3bdcbdfbbe8e739fd3ca0db5c644ecd2349431ff96119c', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.117 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000083', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7843cfa993a1428aaaa660321ebba1ac', 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'hostId': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.119 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000089', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'hostId': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.122 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.162 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.requests volume: 1210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.165 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.202 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.requests volume: 976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.203 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4d58837-d81e-4af1-8de8-d97c204bb0ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1210, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.120819', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b91c296-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': 'ad4f4fc034b0c10a904632400ba9e2746aef039668d30f0363593c5560032541'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.120819', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b921980-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': 'cc2a35413b881887c40c002ad6119a9e70e4e3dc336197b624f0c03d402d9303'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 976, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.120819', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b97cff6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '186ed00e036c884a762711d9d196501c17d9d7036f765bfc4c9a16584bee7d05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.120819', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b97dfaa-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'd6a851c4ebc1813bac717663e1c13a1051cb011cfd81f65c4aa11656f66ce768'}]}, 'timestamp': '2025-11-29 07:28:48.204411', '_unique_id': '613884a6733b4bc09c41e4248afcc1ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.212 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.215 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 19c7d8b2-3f1a-40cf-a538-dd2752970ffb / tap1e2538b2-92 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.216 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.218 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5487c798-eb5a-4186-9693-a64ecd64b296 / tap0273a9e9-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.219 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3de7c2ba-b123-48dc-8c61-4d448e8b963c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.211203', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0b99bf50-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'fbe92b35e69b45801ca636acd4c5b423875e0698fe05da57dba99718bc25a120'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.211203', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0b9a34ee-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '11b3a984ad73995fcdea55d8e0d3b75dcc8338ff19eeb59f2ec7eb6f5a6923f1'}]}, 'timestamp': '2025-11-29 07:28:48.219629', '_unique_id': '4c626d51a1ca4fe6a3c902459b7dac09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.222 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.222 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>]
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.224 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.224 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.224 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f488afa-26bc-4067-98b0-7fd633c48e84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.222802', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0b9af7e4-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'def17c651ad60e51132770edf20b8ffe08d5c2f22b2334cfbbc9e75f800a7041'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.222802', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0b9b0cde-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': 'e9a0c21280cb316e52392848535ea1211483c3d56b24c327f941e5aa014ed54d'}]}, 'timestamp': '2025-11-29 07:28:48.225141', '_unique_id': '0c51341071594aa5a3edb402823982e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.229 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.229 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.incoming.bytes volume: 1071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.230 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.incoming.bytes volume: 622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f02272-0c1b-4ef2-96f2-d7c76c5bcd3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1071, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.227956', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0b9bcebc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'eaa99a0d6f0e44782980f9a235805618fee6759a35e242ed00da8b8c7919d4b7'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 622, 'user_id': 
'4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.227956', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0b9bea3c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '99a2293f37f5dbbd7b34741e19189ae932c1e8e7f5637f04c00cb36a2056312e'}]}, 'timestamp': '2025-11-29 07:28:48.230913', '_unique_id': '67a83e0d89da41ebb9393b5c3c4af428'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.232 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.234 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.235 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.258 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/cpu volume: 11720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.279 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/cpu volume: 11910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '365b7ded-bb28-47d7-86f6-d1081faf9d31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11720000000, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'timestamp': '2025-11-29T07:28:48.234608', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0ba05f0e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.899648261, 'message_signature': '3804f6cdf0c8d6fbf026935360a14889b96e29e8baf478fb84b70a9b7493ed1e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11910000000, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 
'5487c798-eb5a-4186-9693-a64ecd64b296', 'timestamp': '2025-11-29T07:28:48.234608', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0ba37cfc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.920187843, 'message_signature': '8f1d421d41e1f51997da131f5a30a85f41198d5accc0e265ff47987543ff94a5'}]}, 'timestamp': '2025-11-29 07:28:48.280580', '_unique_id': 'd42ef75e753d445282a51383236883d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.282 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.283 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.283 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>]
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.286 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.286 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.requests volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.287 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.287 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.requests volume: 244 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.287 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bd64942-0291-4d5e-831b-380dbff2e882', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 21, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.284358', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ba48232-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': 'e5906e81cafe8eb574706a66cb1aede1b8388ed4199d7d8523711809e7166704'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 
'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.284358', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ba4936c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '8014d337d8ad8db442d22d5726a3d9f1bcf4e4a014889c94c06acae3e39a9911'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 244, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.284358', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ba49f56-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'e03e64d9d39381dc2fd43d1e031b6293310e10c96a346cd70dce26f6e806a6a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.284358', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ba4abea-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '61e5a093d50ac093e00e0a405b7fdb87ea5acd3f3d39d2aee7f91b0d51e4d12d'}]}, 'timestamp': '2025-11-29 07:28:48.288181', '_unique_id': 'e59335c7992646a985a92290883543ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.289 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.290 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.291 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.292 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.292 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/memory.usage volume: 40.44921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fd9de7c-aee7-4fc0-82f2-63917bb60bcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'timestamp': '2025-11-29T07:28:48.290897', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0ba55342-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.899648261, 'message_signature': 'a2dd219477a104cecf34ae336f23494f9f72f989d6b2ffd0a749db0d75817c41'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44921875, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 
'5487c798-eb5a-4186-9693-a64ecd64b296', 'timestamp': '2025-11-29T07:28:48.290897', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0ba561fc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.920187843, 'message_signature': '90d3311ae33d69f94a81cff3951b1ce0f8d2cfb3e504bfb7fa6c0ccaceef74a4'}]}, 'timestamp': '2025-11-29 07:28:48.292861', '_unique_id': '0398d489a81941ea8bf3b8676d557dd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.293 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.296 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.296 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.296 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba0c6fce-f078-449c-ad7d-8445be93388b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.295290', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0ba5f8a6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': '66dbd0124a7cf4faa06ca1222163dd55246b2768b5031442d99595c357b3ab27'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.295290', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0ba60846-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '03d643b14ed353a3e0726d98bf4b25f7f2fa7cf61b051fa10d4c5c7dbd6c9c4e'}]}, 'timestamp': '2025-11-29 07:28:48.297136', '_unique_id': 'd01a9ff5ca1e4ab19628e698f2c5b327'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.298 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.299 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.300 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.301 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.301 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.incoming.packets volume: 7 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbb9c4a2-50e3-418c-9656-bf0dd0df77c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.300048', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0ba6adf0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'f4fa21bb14e88527fe0d031bfbe6c9d1a60c091328a3c3de552157e13ea332ee'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 7, 
'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.300048', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0ba6bd0e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '966f82af8fc0585ffbb5403049de99cea10416ef9f3f63c441853ec87eb84b16'}]}, 'timestamp': '2025-11-29 07:28:48.301755', '_unique_id': '0aed54523248459086ce2555cadd3a39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.302 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.304 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.304 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.outgoing.bytes volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.305 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.outgoing.bytes volume: 266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e84a6188-f13d-418e-aa62-e1d4c5c456b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 984, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.303867', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0ba742f6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': '91b4e4011cfb4218552c3967c1e3c26822b537fb3207dffea336f5a97873e75d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 266, 'user_id': 
'4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.303867', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0ba751ec-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': 'c2e9ce68492fcf2936466fbe6657c03a4a7878a6aaeff23f77c25df1d10b3497'}]}, 'timestamp': '2025-11-29 07:28:48.305563', '_unique_id': '5c3e0d12f8d444c9926d5a762f479752'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.306 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.308 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.308 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.bytes volume: 32040960 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.308 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.309 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.bytes volume: 28113920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.309 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a76b42e1-c81d-44ae-9983-004aa794a408', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32040960, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.307663', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ba7d6c6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '9cacae741401d5a7f90605b6921c5415ca628d025b96c14a8e40b173ea518105'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': 
None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.307663', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ba7e40e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '25dd89bb9d32a79b4c15ea3612684926e1e7dc6d0afc85ecadd4ee1fabc41400'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28113920, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.307663', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0ba7f1e2-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '471e524e56b230c4070e02b9781ba9c50dd425dc095647b003b4a3ead969c9ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.307663', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0ba7febc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'f4cffd4bebd6c696e9ff01abdc45233eff143f8c1a533186150a163ba02242fc'}]}, 'timestamp': '2025-11-29 07:28:48.309974', '_unique_id': '7828c09d83dd4aa1a90bca568abdbd73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.310 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.312 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.312 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>]
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.317 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.328 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.329 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.339 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.340 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33b2b87e-739d-48b2-86c7-41a259c8643d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.312820', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0baae60e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': '7e443f25f28571eab5c8219b09a63ff66632e82d08cab142fbdf66fa43b46232'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.312820', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0baafe1e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': 'f8775c6993dc1101aec098a18ee7dfb86118fd74993b062bc0306b3d3e7f37e1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.312820', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0baca278-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': '5647fdcc71a0ba34945c9ea11602dce0376e14d107ceed484c89dd7764d7ab12'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.312820', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bacb1e6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': '7a2ed8eb4462ecc9cf3e6d2b92b63df21d699c561ecade57a2a7c513df3a63f0'}]}, 'timestamp': '2025-11-29 07:28:48.340803', '_unique_id': '1b5a17db1d7d474b8ce8ae0b681f37ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.342 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.344 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.345 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.latency volume: 239327822 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.345 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.read.latency volume: 18685383 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.345 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.latency volume: 215653164 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.346 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.read.latency volume: 16392755 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5ae3504-48e8-4674-8e89-b05e208973b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 239327822, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.344005', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bad6b86-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '1fd5347d05f6154bec725a4bea26f4b8fe61495aca01c642953af3f91c3e970f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18685383, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 
'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.344005', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bad7acc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '90b8c0d14c1df07758ef6df2ae66bc899f4b6ed8b50a75ed8fddb4278faed3b8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 215653164, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.344005', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bad8882-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '2e7237cf8d84952414eb96f0b7ca9ddfccf308924cd5bd3b049af24e9fdc4d76'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16392755, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.344005', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bad96d8-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '427a68acde5d68f48c2ca3a21664f40dc487f1626e940d49166f240e22871684'}]}, 'timestamp': '2025-11-29 07:28:48.346640', '_unique_id': '535b820d5854423a84700b57db248138'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.347 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.348 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.349 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.350 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.350 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19e2850f-6b4a-4ad9-a608-88daddd963f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.349086', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0bae2ee0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': '910a0be0d1106fd1f71aed00f1b0cdf99fb35bf9afc5f25221d6ce46b814ab4e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.349086', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0bae3d86-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '889a0ec293f95b3f144c49c6c8dc28ffc7b8c8a7dbdf0a764ccb9c90889104ab'}]}, 'timestamp': '2025-11-29 07:28:48.350918', '_unique_id': 'e9d05b8dd9d54805a6c745ee9d0cc6a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.351 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.353 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.354 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.latency volume: 17783422 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.354 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.354 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.latency volume: 6011348357 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.355 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c58cda2-f5b1-4b00-b272-55209461875b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17783422, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.353133', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0baec65c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '6f677d4c83feefd257be2f6ea5c2c7c1ee1c82c75144075a3656c596429dac68'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': 
None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.353133', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0baed3cc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '2fb6baedd0b31e3f5c3620008e3cad1296d23d5dc3560657281adfe7158c3e7a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6011348357, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.353133', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0baee0ce-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'e57dc5c7f8daa6d30ede99291dd6c5342a753609c7d55e58dd1778430528fe8d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.353133', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0baeeeca-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'e350137debb990f915feb4db7e70de75e85430aa53b4789db464aecfa2e71c61'}]}, 'timestamp': '2025-11-29 07:28:48.355438', '_unique_id': '375f92a38fbd4f43aa906073ebe7d701'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.356 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.357 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.358 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.358 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6e4e8cf-fc0d-41d9-8077-5b4a37a20e6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.357269', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0baf60da-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'ea47dee6bd87efd62dc30acef348fa4337355833d1ebd3a9cd8f99699cf931e1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.357269', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0baf6c1a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': 'a77a9780969f75e413075a97a756e5a32ef87a096e02d971f60296eb5141aa7d'}]}, 'timestamp': '2025-11-29 07:28:48.358610', '_unique_id': '3b58ce3c1d2c4a5fabc6731c77b54107'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.359 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.360 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.360 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.361 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.bytes volume: 143360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.361 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.361 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.361 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4377bc4-a601-4570-85b6-65a18c10074a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 143360, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.360153', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bafd4de-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': 'fa0460b4be74a77bca9dc5742c4aa5f4bef9d3f15b90854ecd8f08cfa1d18e2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.360153', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bafdf1a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.763439886, 'message_signature': '013cb05a9829711fdc0e4d941a27bfc3abf0612d878914cdb16e00bf541c3c73'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.360153', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bafe816-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': 'bd5e68075a0a0c19e376d8f1bff4a63a746dfebff5426964f86185cbb26f17fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.360153', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0baff1d0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.807501103, 'message_signature': '54eb205e559f9017502b8d4653f8e1d0574a86c53860b03504ca501b223c56b1'}]}, 'timestamp': '2025-11-29 07:28:48.362023', '_unique_id': '17d344e19d614c36986f91d8d8ab244c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.362 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.363 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.364 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.364 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.364 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.365 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.365 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05e90aca-5297-4afa-9532-137a01b8a57e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.363842', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bb05ef4-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': '15cadf9eda4336495ae2df085b0e4558dac2465ceff6ac512cd0cc190a7c839b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 
'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.363842', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bb0682c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': '3322819fed92bd0125fe7418f9cb445f4aa89f939df3c06b839628bb6eb3e719'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.363842', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bb0711e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': '2022e7b0578867b73cf866f318c6a63192c93cae6ad43fab1c18effb59f70c54'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.363842', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bb07b28-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': 'ff4482192b7984ec64a3670a2294beb2d585a6dd9cc2c8ddfd7c29c18d4c759a'}]}, 'timestamp': '2025-11-29 07:28:48.365539', '_unique_id': 'c8eae679533b4291b0c5245461cfe431'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.367 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.367 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.368 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.368 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.368 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76f62a8e-8640-453a-b0e3-c4fcf0fd1b6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb-vda', 'timestamp': '2025-11-29T07:28:48.367075', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bb0e0b8-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': '07949a63b0221f46a9600f04f6d584e121153515048567835d1a11ed0de7a163'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 
'19c7d8b2-3f1a-40cf-a538-dd2752970ffb-sda', 'timestamp': '2025-11-29T07:28:48.367075', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'instance-00000083', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bb0ecb6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.958021473, 'message_signature': '871eda9c312b4c08d1265ba64c102a1c2878b5eaed433304182b1dd06766afec'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-vda', 'timestamp': '2025-11-29T07:28:48.367075', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0bb0f8f0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': '08bce939d79b4a8cef63316954d5b52a12a6ee73999548601184e07466f2f86d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': '5487c798-eb5a-4186-9693-a64ecd64b296-sda', 'timestamp': '2025-11-29T07:28:48.367075', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'instance-00000089', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0bb10b9c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.970502577, 'message_signature': '7482f71a3e85eef16a35f6ad7a5fb72a1f3ced2ca151e7d2d906145ca6d232f9'}]}, 'timestamp': '2025-11-29 07:28:48.369287', '_unique_id': 'a91c09934e9d45319cb47638b0b7d691'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.370 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.371 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.371 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.371 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-468452072>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1624881395>, <NovaLikeServer: tempest-ServerRescueNegativeTestJSON-server-400688363>]
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.372 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.372 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.372 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.outgoing.packets volume: 3 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5403f9b3-f12a-423e-82ab-a24f0039a15d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.371663', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0bb19b84-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': '0d2e08e984bd48190f9b139cb1500f3ffc01fc7474fd893122bce9cd56e295dc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 3, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.371663', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0bb1a57a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': 'fc7075e5450d66d65b8e1b4fd18d3ce0bbb6f719d1d23499c24f1d572c895b8f'}]}, 'timestamp': '2025-11-29 07:28:48.373185', '_unique_id': '51eff6ff811b4a38857d5b94676a8b17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.373 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.375 12 DEBUG ceilometer.compute.pollsters [-] Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000086, id=97b39e54-312a-4ebc-863b-e1ef5f4cf363>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.375 12 DEBUG ceilometer.compute.pollsters [-] 19c7d8b2-3f1a-40cf-a538-dd2752970ffb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.375 12 DEBUG ceilometer.compute.pollsters [-] 5487c798-eb5a-4186-9693-a64ecd64b296/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4950ad30-cdbe-4ee1-9e62-6141fb1a6a50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3e2a40601ced4de78fe1767769f262c0', 'user_name': None, 'project_id': '7843cfa993a1428aaaa660321ebba1ac', 'project_name': None, 'resource_id': 'instance-00000083-19c7d8b2-3f1a-40cf-a538-dd2752970ffb-tap1e2538b2-92', 'timestamp': '2025-11-29T07:28:48.374803', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1624881395', 'name': 'tap1e2538b2-92', 'instance_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'instance_type': 'm1.nano', 'host': 'e6ca3404db64e7dd35a397b91945fb76da4566c34ae37f1aea448b23', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a4:02:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e2538b2-92'}, 'message_id': '0bb21294-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.85400882, 'message_signature': 'c837e78d53eef0e7a180f4810c59dc9354d22d9ae615b6a02350e703469687d2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4863fb992d4c48de9a92f63ffb1174a8', 'user_name': None, 'project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'project_name': None, 'resource_id': 'instance-00000089-5487c798-eb5a-4186-9693-a64ecd64b296-tap0273a9e9-32', 'timestamp': '2025-11-29T07:28:48.374803', 'resource_metadata': {'display_name': 'tempest-ServerRescueNegativeTestJSON-server-400688363', 'name': 'tap0273a9e9-32', 'instance_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'instance_type': 'm1.nano', 'host': '55a1b26e37832e95c95f6b4da5f07fafa263792b158229c730421338', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7d:ab:ce', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0273a9e9-32'}, 'message_id': '0bb21c3a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6775.857459767, 'message_signature': '2eaed453ca8af916e90ae9a15b51028446886763e1809b7214566f407416e0a3'}]}, 'timestamp': '2025-11-29 07:28:48.376229', '_unique_id': 'd87b0dd81a1c4211a6de6dc8df5fd4fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:28:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:28:48.376 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:28:48 np0005539505 nova_compute[186958]: 2025-11-29 07:28:48.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:28:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:49Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:ab:ce 10.100.0.8
Nov 29 02:28:49 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:49Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:ab:ce 10.100.0.8
Nov 29 02:28:50 np0005539505 nova_compute[186958]: 2025-11-29 07:28:50.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:28:50 np0005539505 nova_compute[186958]: 2025-11-29 07:28:50.941 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.225 186962 DEBUG nova.compute.manager [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.225 186962 DEBUG oslo_concurrency.lockutils [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.225 186962 DEBUG oslo_concurrency.lockutils [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.226 186962 DEBUG oslo_concurrency.lockutils [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.226 186962 DEBUG nova.compute.manager [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:28:51 np0005539505 nova_compute[186958]: 2025-11-29 07:28:51.226 186962 WARNING nova.compute.manager [req-b1de5703-e755-432d-9302-2f372cfa960e req-8c2a96b2-5348-4e5a-82fc-4b5329e87acc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state suspended and task_state None.
Nov 29 02:28:52 np0005539505 nova_compute[186958]: 2025-11-29 07:28:52.984 186962 INFO nova.compute.manager [None req-193cd3b0-d81c-4a26-8c35-e436c5f744cc bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Get console output
Nov 29 02:28:53 np0005539505 kernel: tap0273a9e9-32 (unregistering): left promiscuous mode
Nov 29 02:28:53 np0005539505 NetworkManager[55134]: <info>  [1764401333.1596] device (tap0273a9e9-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:28:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:53Z|00632|binding|INFO|Releasing lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c from this chassis (sb_readonly=0)
Nov 29 02:28:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:53Z|00633|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c down in Southbound
Nov 29 02:28:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:28:53Z|00634|binding|INFO|Removing iface tap0273a9e9-32 ovn-installed in OVS
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.180 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.207 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:53 np0005539505 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Deactivated successfully.
Nov 29 02:28:53 np0005539505 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000089.scope: Consumed 14.006s CPU time.
Nov 29 02:28:53 np0005539505 systemd-machined[153285]: Machine qemu-71-instance-00000089 terminated.
Nov 29 02:28:53 np0005539505 podman[241364]: 2025-11-29 07:28:53.27093847 +0000 UTC m=+0.078751270 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 29 02:28:53 np0005539505 podman[241367]: 2025-11-29 07:28:53.273448131 +0000 UTC m=+0.070812715 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.657 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:ab:ce 10.100.0.8'], port_security=['fa:16:3e:7d:ab:ce 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0273a9e9-32f2-4363-a3b4-aa1a87caf07c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.660 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d unbound from our chassis#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.664 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 008329a1-d4dc-4cfb-be68-95f658d9813d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.667 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[78bcfcab-5a08-4acb-b185-275107eb3ec9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.668 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace which is not needed anymore#033[00m
Nov 29 02:28:53 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [NOTICE]   (241183) : haproxy version is 2.8.14-c23fe91
Nov 29 02:28:53 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [NOTICE]   (241183) : path to executable is /usr/sbin/haproxy
Nov 29 02:28:53 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [WARNING]  (241183) : Exiting Master process...
Nov 29 02:28:53 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [ALERT]    (241183) : Current worker (241190) exited with code 143 (Terminated)
Nov 29 02:28:53 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241157]: [WARNING]  (241183) : All workers exited. Exiting... (0)
Nov 29 02:28:53 np0005539505 systemd[1]: libpod-00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7.scope: Deactivated successfully.
Nov 29 02:28:53 np0005539505 podman[241450]: 2025-11-29 07:28:53.84190268 +0000 UTC m=+0.054310788 container died 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.850 186962 DEBUG nova.compute.manager [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.851 186962 DEBUG oslo_concurrency.lockutils [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.851 186962 DEBUG oslo_concurrency.lockutils [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.852 186962 DEBUG oslo_concurrency.lockutils [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.852 186962 DEBUG nova.compute.manager [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.852 186962 WARNING nova.compute.manager [req-165b7b5e-90bc-41e8-8210-e72b88ceb741 req-af6033b1-3438-4fe5-a79e-8e1bc543869b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state suspended and task_state resuming.#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.855 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7-userdata-shm.mount: Deactivated successfully.
Nov 29 02:28:53 np0005539505 systemd[1]: var-lib-containers-storage-overlay-ca9b21ffcc47252e9e09ad7acbe73bb4b9630206e844c6bda2dcfc95eea105cd-merged.mount: Deactivated successfully.
Nov 29 02:28:53 np0005539505 podman[241450]: 2025-11-29 07:28:53.878553827 +0000 UTC m=+0.090961905 container cleanup 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:28:53 np0005539505 systemd[1]: libpod-conmon-00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7.scope: Deactivated successfully.
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:53 np0005539505 podman[241481]: 2025-11-29 07:28:53.948404294 +0000 UTC m=+0.046486496 container remove 00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.958 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[389f0fbd-ee23-4372-849b-b6be87224b64]: (4, ('Sat Nov 29 07:28:53 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7)\n00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7\nSat Nov 29 07:28:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7)\n00d125eac6b98dacb6c34e6c1de71c807018714874b4b287ec4916502fe96ad7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.960 186962 INFO nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance shutdown successfully after 13 seconds.#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.962 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[971f2529-7ef9-458f-85bc-1e42861c5264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.963 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:53 np0005539505 kernel: tap008329a1-d0: left promiscuous mode
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.971 186962 INFO nova.virt.libvirt.driver [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance destroyed successfully.#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.971 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'numa_topology' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:53 np0005539505 nova_compute[186958]: 2025-11-29 07:28:53.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:53.988 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a80fb9a1-623f-4b69-a2b2-90285c46a9b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:54.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6f062b0d-8347-418b-9c77-228aa0935055]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:54.005 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bd66b0de-de8a-4f4f-b8f6-d89e51b2df9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:54.022 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5c52437a-078d-4a82-9a5e-85122548d62d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676213, 'reachable_time': 22036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241502, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:54.024 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:28:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:28:54.024 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3350ead0-caab-42f4-b1ad-e97484b6e996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:28:54 np0005539505 systemd[1]: run-netns-ovnmeta\x2d008329a1\x2dd4dc\x2d4cfb\x2dbe68\x2d95f658d9813d.mount: Deactivated successfully.
Nov 29 02:28:54 np0005539505 nova_compute[186958]: 2025-11-29 07:28:54.151 186962 INFO nova.compute.manager [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Resuming#033[00m
Nov 29 02:28:54 np0005539505 nova_compute[186958]: 2025-11-29 07:28:54.152 186962 DEBUG nova.objects.instance [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'flavor' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.169 186962 INFO nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Attempting rescue#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.170 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.175 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.175 186962 INFO nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Creating image(s)#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.176 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.176 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.177 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.177 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'trusted_certs' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.589 186962 DEBUG nova.compute.manager [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.589 186962 DEBUG oslo_concurrency.lockutils [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.590 186962 DEBUG oslo_concurrency.lockutils [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.590 186962 DEBUG oslo_concurrency.lockutils [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.590 186962 DEBUG nova.compute.manager [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.590 186962 WARNING nova.compute.manager [req-7d9ed5d1-300d-4114-a08b-ec94848f8c2c req-5aec4086-8a96-4e5e-855c-475393c45cc2 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:28:55 np0005539505 nova_compute[186958]: 2025-11-29 07:28:55.692 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:55 np0005539505 podman[241503]: 2025-11-29 07:28:55.727841426 +0000 UTC m=+0.055355108 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.421 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.422 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.445 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.541 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.542 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.633 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.rescue" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.634 186962 DEBUG oslo_concurrency.lockutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:57 np0005539505 nova_compute[186958]: 2025-11-29 07:28:57.634 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'migration_context' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:28:58 np0005539505 nova_compute[186958]: 2025-11-29 07:28:58.906 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.954 186962 DEBUG nova.compute.manager [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.954 186962 DEBUG oslo_concurrency.lockutils [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.954 186962 DEBUG oslo_concurrency.lockutils [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.955 186962 DEBUG oslo_concurrency.lockutils [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.955 186962 DEBUG nova.compute.manager [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.955 186962 WARNING nova.compute.manager [req-be124b2c-b3d0-498a-8ced-287e08aa3fad req-50dc1c96-f732-4a21-8497-5bac81b771e0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.975 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.975 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start _get_guest_xml network_info=[{"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "vif_mac": "fa:16:3e:7d:ab:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:28:59 np0005539505 nova_compute[186958]: 2025-11-29 07:28:59.976 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'resources' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.004 186962 WARNING nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.014 186962 DEBUG oslo_concurrency.lockutils [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.014 186962 DEBUG oslo_concurrency.lockutils [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquired lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.015 186962 DEBUG nova.network.neutron [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.016 186962 DEBUG nova.virt.libvirt.host [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.018 186962 DEBUG nova.virt.libvirt.host [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.022 186962 DEBUG nova.virt.libvirt.host [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.023 186962 DEBUG nova.virt.libvirt.host [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.024 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.024 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.024 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.025 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.025 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.025 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.025 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.026 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.026 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.026 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.026 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.027 186962 DEBUG nova.virt.hardware [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.027 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.072 186962 DEBUG nova.virt.libvirt.vif [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-400688363',display_name='tempest-ServerRescueNegativeTestJSON-server-400688363',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-400688363',id=137,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-1c7hw7ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:28:34Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=5487c798-eb5a-4186-9693-a64ecd64b296,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "vif_mac": "fa:16:3e:7d:ab:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.073 186962 DEBUG nova.network.os_vif_util [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "vif_mac": "fa:16:3e:7d:ab:ce"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.074 186962 DEBUG nova.network.os_vif_util [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.075 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'pci_devices' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.088 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <uuid>5487c798-eb5a-4186-9693-a64ecd64b296</uuid>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <name>instance-00000089</name>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-400688363</nova:name>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:29:00</nova:creationTime>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:user uuid="4863fb992d4c48de9a92f63ffb1174a8">tempest-ServerRescueNegativeTestJSON-1892401049-project-member</nova:user>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:project uuid="5d1e4f74add34e9b9a2084bd9586db0c">tempest-ServerRescueNegativeTestJSON-1892401049</nova:project>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        <nova:port uuid="0273a9e9-32f2-4363-a3b4-aa1a87caf07c">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="serial">5487c798-eb5a-4186-9693-a64ecd64b296</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="uuid">5487c798-eb5a-4186-9693-a64ecd64b296</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.rescue"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <target dev="vdb" bus="virtio"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config.rescue"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:7d:ab:ce"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <target dev="tap0273a9e9-32"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/console.log" append="off"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:29:00 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:29:00 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:29:00 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:29:00 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.098 186962 INFO nova.virt.libvirt.driver [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance destroyed successfully.#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.179 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.180 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.180 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.181 186962 DEBUG nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] No VIF found with MAC fa:16:3e:7d:ab:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.181 186962 INFO nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Using config drive#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.240 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'ec2_ids' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.295 186962 DEBUG nova.objects.instance [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'keypairs' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.757 186962 INFO nova.virt.libvirt.driver [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Creating config drive at /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config.rescue#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.761 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpss50zmn_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.890 186962 DEBUG oslo_concurrency.processutils [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpss50zmn_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:00 np0005539505 NetworkManager[55134]: <info>  [1764401340.9672] manager: (tap0273a9e9-32): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Nov 29 02:29:00 np0005539505 kernel: tap0273a9e9-32: entered promiscuous mode
Nov 29 02:29:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:00Z|00635|binding|INFO|Claiming lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c for this chassis.
Nov 29 02:29:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:00Z|00636|binding|INFO|0273a9e9-32f2-4363-a3b4-aa1a87caf07c: Claiming fa:16:3e:7d:ab:ce 10.100.0.8
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.971 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:00Z|00637|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c ovn-installed in OVS
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.989 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:00 np0005539505 systemd-udevd[241549]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:00 np0005539505 nova_compute[186958]: 2025-11-29 07:29:00.996 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:00Z|00638|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c up in Southbound
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:00.999 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:ab:ce 10.100.0.8'], port_security=['fa:16:3e:7d:ab:ce 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0273a9e9-32f2-4363-a3b4-aa1a87caf07c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.000 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d bound to our chassis#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.002 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 008329a1-d4dc-4cfb-be68-95f658d9813d#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.013 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffa18df-f4f4-4717-9231-533c898068b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.014 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap008329a1-d1 in ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.0152] device (tap0273a9e9-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:01 np0005539505 systemd-machined[153285]: New machine qemu-72-instance-00000089.
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.0161] device (tap0273a9e9-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.016 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap008329a1-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.016 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[be892268-5497-4e80-8661-b9e2b88914ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.017 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb91767-58f8-4a07-832b-640287a14f73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.028 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[2da8233a-496d-4ba6-92d5-26d83ae356a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 systemd[1]: Started Virtual Machine qemu-72-instance-00000089.
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.053 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4a7515-ac8f-4439-80f9-b5c06a4a1662]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.089 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8535e8-8e45-49d3-8aa8-468679e444f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.094 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b870b1fc-e681-401c-8ec0-9a053132dac0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.0956] manager: (tap008329a1-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Nov 29 02:29:01 np0005539505 systemd-udevd[241555]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.129 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[aad8a7b4-bf65-48cf-bbe4-6aeaa737d2a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.132 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c2035e0d-361c-44b6-b59e-2b69a423eabf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.1543] device (tap008329a1-d0): carrier: link connected
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.161 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[93631ba9-d63c-4b8c-8586-2323cb0e32e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.181 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fb8b6bad-976c-4e30-930c-9829016da649]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678873, 'reachable_time': 38834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241588, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.197 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[24bf0799-c1af-49b9-a67c-e668160ecc02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec4:da9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678873, 'tstamp': 678873}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241589, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.215 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5a8f0f8b-069f-4412-aeb2-13f6e817d8c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap008329a1-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c4:0d:a9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678873, 'reachable_time': 38834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241590, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.247 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6001ed-47b4-4890-bfb6-649309b67b34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.314 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfbffb3-0c7a-442d-b866-973abfde1e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.316 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.316 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.317 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap008329a1-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.318 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.3195] manager: (tap008329a1-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Nov 29 02:29:01 np0005539505 kernel: tap008329a1-d0: entered promiscuous mode
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.322 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.324 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap008329a1-d0, col_values=(('external_ids', {'iface-id': 'd36011d9-2f3d-4616-b3ba-40f6405df460'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.325 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:01Z|00639|binding|INFO|Releasing lport d36011d9-2f3d-4616-b3ba-40f6405df460 from this chassis (sb_readonly=0)
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.336 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.340 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.341 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.342 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4290203d-c70b-4a98-a867-5c9d47f2cca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.342 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/008329a1-d4dc-4cfb-be68-95f658d9813d.pid.haproxy
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 008329a1-d4dc-4cfb-be68-95f658d9813d
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.345 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'env', 'PROCESS_TAG=haproxy-008329a1-d4dc-4cfb-be68-95f658d9813d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/008329a1-d4dc-4cfb-be68-95f658d9813d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.370 186962 DEBUG nova.network.neutron [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.398 186962 DEBUG oslo_concurrency.lockutils [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Releasing lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.404 186962 DEBUG nova.virt.libvirt.vif [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-468452072',display_name='tempest-TestNetworkAdvancedServerOps-server-468452072',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-468452072',id=134,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYkyrD6GH7Y6zXJE6UdUt8xCR/1va5hW1FscfbM7L30ylZqlz2D8TZeKR1ExqmSnTdiWL+UU0qKusuYO7HAthTySgqkMobIXogk9BOrKSRDnqKaAi1AWXXuuhtQyh3SVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1220905380',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-4no3w984',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:47Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=97b39e54-312a-4ebc-863b-e1ef5f4cf363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.404 186962 DEBUG nova.network.os_vif_util [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.405 186962 DEBUG nova.network.os_vif_util [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:01 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.406 186962 DEBUG os_vif [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:29:01 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.406 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.406 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.407 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.410 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9083d4b6-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.410 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9083d4b6-b3, col_values=(('external_ids', {'iface-id': '9083d4b6-b3e2-451d-b686-a9cab7e5a2f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:93:e0', 'vm-uuid': '97b39e54-312a-4ebc-863b-e1ef5f4cf363'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.410 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.410 186962 INFO os_vif [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3')#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.437 186962 DEBUG nova.objects.instance [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'numa_topology' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:01 np0005539505 kernel: tap9083d4b6-b3: entered promiscuous mode
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.5175] manager: (tap9083d4b6-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/315)
Nov 29 02:29:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:01Z|00640|binding|INFO|Claiming lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for this chassis.
Nov 29 02:29:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:01Z|00641|binding|INFO|9083d4b6-b3e2-451d-b686-a9cab7e5a2f5: Claiming fa:16:3e:19:93:e0 10.100.0.7
Nov 29 02:29:01 np0005539505 systemd-udevd[241574]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.536 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:01Z|00642|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 ovn-installed in OVS
Nov 29 02:29:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:01Z|00643|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 up in Southbound
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.5377] device (tap9083d4b6-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.537 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.5384] device (tap9083d4b6-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.537 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:93:e0 10.100.0.7'], port_security=['fa:16:3e:19:93:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '97b39e54-312a-4ebc-863b-e1ef5f4cf363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '5', 'neutron:security_group_ids': '99684fb2-cb2f-45bc-99fa-10934e68636b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.224'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48e2bcd2-f544-4812-a73e-4d43e4ef323e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.541 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.554 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401326.5521142, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.554 186962 INFO nova.compute.manager [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:01 np0005539505 systemd-machined[153285]: New machine qemu-73-instance-00000086.
Nov 29 02:29:01 np0005539505 systemd[1]: Started Virtual Machine qemu-73-instance-00000086.
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.590 186962 DEBUG nova.compute.manager [None req-accc1733-7837-431b-9277-dbcced306794 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:01 np0005539505 podman[241645]: 2025-11-29 07:29:01.744719477 +0000 UTC m=+0.053966018 container create fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:29:01 np0005539505 systemd[1]: Started libpod-conmon-fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45.scope.
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.787 186962 DEBUG nova.compute.manager [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.788 186962 DEBUG oslo_concurrency.lockutils [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.789 186962 DEBUG oslo_concurrency.lockutils [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.789 186962 DEBUG oslo_concurrency.lockutils [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.789 186962 DEBUG nova.compute.manager [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:01 np0005539505 nova_compute[186958]: 2025-11-29 07:29:01.789 186962 WARNING nova.compute.manager [req-41874863-79f6-42a2-afa9-2c7dfc4cf369 req-1e23aaa9-171e-4499-9265-6a5f2f9aea36 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state suspended and task_state resuming.#033[00m
Nov 29 02:29:01 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:29:01 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/985a307ca8f7fe6921035a0598b6b43de18753c8260cd2790ab5d260ae3e7bde/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:01 np0005539505 podman[241645]: 2025-11-29 07:29:01.712053013 +0000 UTC m=+0.021299584 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:01 np0005539505 podman[241645]: 2025-11-29 07:29:01.820188883 +0000 UTC m=+0.129435474 container init fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:29:01 np0005539505 podman[241645]: 2025-11-29 07:29:01.826261015 +0000 UTC m=+0.135507576 container start fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:29:01 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [NOTICE]   (241664) : New worker (241671) forked
Nov 29 02:29:01 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [NOTICE]   (241664) : Loading success.
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.888 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 in datapath 717f1c01-fb17-41f0-848c-cebdb3841bf9 unbound from our chassis#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.890 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 717f1c01-fb17-41f0-848c-cebdb3841bf9#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.903 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[84a1c1be-37df-451e-8a3e-c438455800b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.904 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap717f1c01-f1 in ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.905 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap717f1c01-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.905 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6a49246a-2a2b-49dd-ae20-d28fc57d73dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.906 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ebd4f9ce-37bd-40bf-9dd0-442fdc2a54cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.919 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[269c6533-f153-4e1d-825d-58136bee2e22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.943 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0829918f-4ab1-4568-b87a-8fbcda4bb80a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.974 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[75972921-90c6-4801-979f-e6756cbedecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:01 np0005539505 NetworkManager[55134]: <info>  [1764401341.9822] manager: (tap717f1c01-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/316)
Nov 29 02:29:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:01.987 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a938bef0-e65b-4e5d-9177-035670d3348d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.011 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.012 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.012 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.012 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.012 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.022 186962 INFO nova.compute.manager [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Terminating instance#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.024 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[85e5ab75-9271-42fc-9a42-cea264aeb34f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.028 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d421aeb3-3d6d-466d-89ae-71a9f84c48d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.032 186962 DEBUG nova.compute.manager [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:02 np0005539505 NetworkManager[55134]: <info>  [1764401342.0485] device (tap717f1c01-f0): carrier: link connected
Nov 29 02:29:02 np0005539505 kernel: tap1e2538b2-92 (unregistering): left promiscuous mode
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.055 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a85d344a-8305-48a2-a024-3bc7f5269297]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 NetworkManager[55134]: <info>  [1764401342.0586] device (tap1e2538b2-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.072 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:02Z|00644|binding|INFO|Releasing lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 from this chassis (sb_readonly=0)
Nov 29 02:29:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:02Z|00645|binding|INFO|Setting lport 1e2538b2-9233-45f7-9334-e7fcdba1da31 down in Southbound
Nov 29 02:29:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:02Z|00646|binding|INFO|Removing iface tap1e2538b2-92 ovn-installed in OVS
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.082 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:02:7c 10.100.0.14'], port_security=['fa:16:3e:a4:02:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '19c7d8b2-3f1a-40cf-a538-dd2752970ffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28412826-5463-46e4-95cb-a7d788b1ab15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7843cfa993a1428aaaa660321ebba1ac', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b91ab01c-e143-4067-9931-a92270268d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3cbf7b29-c247-42f8-abc3-94d1e6be8d3f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1e2538b2-9233-45f7-9334-e7fcdba1da31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.083 186962 DEBUG nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.084 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.084 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.084 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.084 186962 DEBUG nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.085 186962 WARNING nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.085 186962 DEBUG nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.086 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.086 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.086 186962 DEBUG oslo_concurrency.lockutils [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.086 186962 DEBUG nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.086 186962 WARNING nova.compute.manager [req-78bcedc3-dd80-44f8-8ee6-eb266b5df7a3 req-9aedca69-9c8b-4041-9361-2512e809ffad 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state active and task_state rescuing.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.087 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.087 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8601fb93-d7fd-4854-ab16-3c6bf7115698]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap717f1c01-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:5b:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678963, 'reachable_time': 43238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241694, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.103 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b31791c8-2064-4d56-9d87-a62057143225]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:5bed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678963, 'tstamp': 678963}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241696, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000083.scope: Deactivated successfully.
Nov 29 02:29:02 np0005539505 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000083.scope: Consumed 13.869s CPU time.
Nov 29 02:29:02 np0005539505 systemd-machined[153285]: Machine qemu-70-instance-00000083 terminated.
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.127 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa2ec73-638f-473f-9bbd-a6c313c7f58a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap717f1c01-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:5b:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 204], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678963, 'reachable_time': 43238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241697, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.136 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401342.1363008, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.137 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Started (Lifecycle Event)#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.160 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ca15c5-92e1-4b51-88b6-3f3b04d94eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.166 186962 DEBUG nova.compute.manager [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.166 186962 DEBUG nova.objects.instance [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'pci_devices' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.172 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.177 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.188 186962 INFO nova.virt.libvirt.driver [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance running successfully.#033[00m
Nov 29 02:29:02 np0005539505 virtqemud[186353]: argument unsupported: QEMU guest agent is not configured
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.191 186962 DEBUG nova.virt.libvirt.guest [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.191 186962 DEBUG nova.compute.manager [None req-61b0bbbb-456b-4bb4-a433-af2bb37cc3ae bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.206 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.207 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401342.1410503, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.207 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.222 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4705801b-c214-452d-9d5a-d6e5eb1b8dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.223 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap717f1c01-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.224 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.224 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap717f1c01-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 NetworkManager[55134]: <info>  [1764401342.2270] manager: (tap717f1c01-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Nov 29 02:29:02 np0005539505 kernel: tap717f1c01-f0: entered promiscuous mode
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.231 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.233 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap717f1c01-f0, col_values=(('external_ids', {'iface-id': '84ef03fa-56c5-40a8-989e-6c02a5b46df5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:02 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:02Z|00647|binding|INFO|Releasing lport 84ef03fa-56c5-40a8-989e-6c02a5b46df5 from this chassis (sb_readonly=0)
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.248 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.249 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 NetworkManager[55134]: <info>  [1764401342.2514] manager: (tap1e2538b2-92): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.253 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.256 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.255 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.257 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7902819c-3af9-44ce-a181-f61f9035008b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.258 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-717f1c01-fb17-41f0-848c-cebdb3841bf9
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/717f1c01-fb17-41f0-848c-cebdb3841bf9.pid.haproxy
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 717f1c01-fb17-41f0-848c-cebdb3841bf9
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.258 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'env', 'PROCESS_TAG=haproxy-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/717f1c01-fb17-41f0-848c-cebdb3841bf9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.291 186962 INFO nova.virt.libvirt.driver [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Instance destroyed successfully.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.292 186962 DEBUG nova.objects.instance [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lazy-loading 'resources' on Instance uuid 19c7d8b2-3f1a-40cf-a538-dd2752970ffb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.326 186962 DEBUG nova.virt.libvirt.vif [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:27:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1624881395',display_name='tempest-ListServerFiltersTestJSON-instance-1624881395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1624881395',id=131,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:27:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7843cfa993a1428aaaa660321ebba1ac',ramdisk_id='',reservation_id='r-br107bnl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1571311845',owner_user_name='tempest-ListServerFiltersTestJSON-1571311845-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:28:31Z,user_data=None,user_id='3e2a40601ced4de78fe1767769f262c0',uuid=19c7d8b2-3f1a-40cf-a538-dd2752970ffb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.326 186962 DEBUG nova.network.os_vif_util [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converting VIF {"id": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "address": "fa:16:3e:a4:02:7c", "network": {"id": "28412826-5463-46e4-95cb-a7d788b1ab15", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1363473809-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7843cfa993a1428aaaa660321ebba1ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e2538b2-92", "ovs_interfaceid": "1e2538b2-9233-45f7-9334-e7fcdba1da31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.327 186962 DEBUG nova.network.os_vif_util [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.328 186962 DEBUG os_vif [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.329 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e2538b2-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.333 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.335 186962 INFO os_vif [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:02:7c,bridge_name='br-int',has_traffic_filtering=True,id=1e2538b2-9233-45f7-9334-e7fcdba1da31,network=Network(28412826-5463-46e4-95cb-a7d788b1ab15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e2538b2-92')#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.336 186962 INFO nova.virt.libvirt.driver [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Deleting instance files /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb_del#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.337 186962 INFO nova.virt.libvirt.driver [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Deletion of /var/lib/nova/instances/19c7d8b2-3f1a-40cf-a538-dd2752970ffb_del complete#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.429 186962 INFO nova.compute.manager [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.429 186962 DEBUG oslo.service.loopingcall [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.430 186962 DEBUG nova.compute.manager [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.430 186962 DEBUG nova.network.neutron [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:02 np0005539505 podman[241744]: 2025-11-29 07:29:02.62463197 +0000 UTC m=+0.044659955 container create e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:02 np0005539505 systemd[1]: Started libpod-conmon-e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01.scope.
Nov 29 02:29:02 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:29:02 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c31a38509ce20379f44aaa6b7bc04156e4b012440e52f180231a029d689ed1e0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:02 np0005539505 podman[241744]: 2025-11-29 07:29:02.600048464 +0000 UTC m=+0.020076469 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:02 np0005539505 podman[241744]: 2025-11-29 07:29:02.699133048 +0000 UTC m=+0.119161033 container init e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:02 np0005539505 podman[241744]: 2025-11-29 07:29:02.705117618 +0000 UTC m=+0.125145603 container start e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [NOTICE]   (241764) : New worker (241771) forked
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [NOTICE]   (241764) : Loading success.
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.769 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1e2538b2-9233-45f7-9334-e7fcdba1da31 in datapath 28412826-5463-46e4-95cb-a7d788b1ab15 unbound from our chassis#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.771 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28412826-5463-46e4-95cb-a7d788b1ab15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.772 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[766d59ca-74a6-4f74-95b8-990ac72926c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:02.772 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 namespace which is not needed anymore#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.831 186962 DEBUG nova.virt.libvirt.host [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Removed pending event for 5487c798-eb5a-4186-9693-a64ecd64b296 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.831 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401342.8305182, 5487c798-eb5a-4186-9693-a64ecd64b296 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.832 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.852 186962 DEBUG nova.compute.manager [None req-66a13008-1a13-4f44-8fc2-c920117a7a6f 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.883 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.887 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [NOTICE]   (241010) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [NOTICE]   (241010) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [WARNING]  (241010) : Exiting Master process...
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [WARNING]  (241010) : Exiting Master process...
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [ALERT]    (241010) : Current worker (241012) exited with code 143 (Terminated)
Nov 29 02:29:02 np0005539505 neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15[241006]: [WARNING]  (241010) : All workers exited. Exiting... (0)
Nov 29 02:29:02 np0005539505 systemd[1]: libpod-e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9.scope: Deactivated successfully.
Nov 29 02:29:02 np0005539505 conmon[241006]: conmon e5c43faaf86bb4bb183a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9.scope/container/memory.events
Nov 29 02:29:02 np0005539505 podman[241798]: 2025-11-29 07:29:02.915515483 +0000 UTC m=+0.048538885 container died e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.923 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.924 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401342.8321276, 5487c798-eb5a-4186-9693-a64ecd64b296 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.924 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Started (Lifecycle Event)#033[00m
Nov 29 02:29:02 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:02 np0005539505 systemd[1]: var-lib-containers-storage-overlay-a634b3bce9b25e2f7774bd2016dfffbabae8e78ae8229234d1ae66b2dd2a682d-merged.mount: Deactivated successfully.
Nov 29 02:29:02 np0005539505 podman[241798]: 2025-11-29 07:29:02.950814322 +0000 UTC m=+0.083837724 container cleanup e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.955 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:02 np0005539505 nova_compute[186958]: 2025-11-29 07:29:02.958 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:02 np0005539505 systemd[1]: libpod-conmon-e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9.scope: Deactivated successfully.
Nov 29 02:29:03 np0005539505 podman[241827]: 2025-11-29 07:29:03.013802854 +0000 UTC m=+0.038422658 container remove e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.018 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e29e2040-dcc5-412e-81ae-3e9d61c86100]: (4, ('Sat Nov 29 07:29:02 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9)\ne5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9\nSat Nov 29 07:29:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 (e5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9)\ne5c43faaf86bb4bb183ae6545dd3d12989d26063e35a291b35d3313dfbd9fcc9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.020 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1c830a04-058e-4751-9b3e-417648bb36e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.021 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28412826-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.022 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:03 np0005539505 kernel: tap28412826-50: left promiscuous mode
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.038 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a18bd2f1-43f6-463c-b1ac-d3239a0a9498]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.054 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7fce314f-ddf0-47c2-9fd6-c52d135d1199]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.056 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d2f5dbd9-9209-4842-b12b-00427caac081]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.070 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc75f11-201b-4fee-bfd0-d94eae500dc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 675917, 'reachable_time': 31470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241841, 'error': None, 'target': 'ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 systemd[1]: run-netns-ovnmeta\x2d28412826\x2d5463\x2d46e4\x2d95cb\x2da7d788b1ab15.mount: Deactivated successfully.
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.073 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28412826-5463-46e4-95cb-a7d788b1ab15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:03.073 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[871d7906-70bc-4559-a2f6-a29dac50aa08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.233 186962 DEBUG nova.network.neutron [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.265 186962 INFO nova.compute.manager [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Took 0.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.364 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.364 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.467 186962 DEBUG nova.compute.provider_tree [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.491 186962 DEBUG nova.scheduler.client.report [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.541 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.735 186962 INFO nova.scheduler.client.report [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Deleted allocations for instance 19c7d8b2-3f1a-40cf-a538-dd2752970ffb#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.806 186962 DEBUG oslo_concurrency.lockutils [None req-fd3aef3d-7f6e-4fd2-83a8-07f1de05f316 3e2a40601ced4de78fe1767769f262c0 7843cfa993a1428aaaa660321ebba1ac - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.794s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.892 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.893 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.893 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.894 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.894 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.894 186962 WARNING nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.894 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.895 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.895 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.895 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.896 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.896 186962 WARNING nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-unplugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.896 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.896 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.897 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.897 186962 DEBUG oslo_concurrency.lockutils [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "19c7d8b2-3f1a-40cf-a538-dd2752970ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.897 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] No waiting events found dispatching network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.898 186962 WARNING nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received unexpected event network-vif-plugged-1e2538b2-9233-45f7-9334-e7fcdba1da31 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:03 np0005539505 nova_compute[186958]: 2025-11-29 07:29:03.898 186962 DEBUG nova.compute.manager [req-ae17b909-b5b7-4092-a795-609de15ff5f9 req-691798c4-aa9a-44be-9db0-95c7a87449ca 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Received event network-vif-deleted-1e2538b2-9233-45f7-9334-e7fcdba1da31 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:05 np0005539505 nova_compute[186958]: 2025-11-29 07:29:05.027 186962 INFO nova.compute.manager [None req-97620a2e-14e9-4361-a010-efbe1407f98c bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Get console output#033[00m
Nov 29 02:29:05 np0005539505 nova_compute[186958]: 2025-11-29 07:29:05.032 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:29:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:05Z|00648|binding|INFO|Releasing lport d36011d9-2f3d-4616-b3ba-40f6405df460 from this chassis (sb_readonly=0)
Nov 29 02:29:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:05Z|00649|binding|INFO|Releasing lport 84ef03fa-56c5-40a8-989e-6c02a5b46df5 from this chassis (sb_readonly=0)
Nov 29 02:29:05 np0005539505 nova_compute[186958]: 2025-11-29 07:29:05.645 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:05 np0005539505 nova_compute[186958]: 2025-11-29 07:29:05.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:05 np0005539505 podman[241842]: 2025-11-29 07:29:05.738051167 +0000 UTC m=+0.064636060 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:29:05 np0005539505 podman[241843]: 2025-11-29 07:29:05.815009035 +0000 UTC m=+0.135559438 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.244 186962 DEBUG nova.compute.manager [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.244 186962 DEBUG nova.compute.manager [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing instance network info cache due to event network-changed-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.244 186962 DEBUG oslo_concurrency.lockutils [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.245 186962 DEBUG oslo_concurrency.lockutils [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.245 186962 DEBUG nova.network.neutron [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Refreshing network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.294 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.295 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.295 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.296 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.296 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.307 186962 INFO nova.compute.manager [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Terminating instance#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.320 186962 DEBUG nova.compute.manager [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:06 np0005539505 kernel: tap9083d4b6-b3 (unregistering): left promiscuous mode
Nov 29 02:29:06 np0005539505 NetworkManager[55134]: <info>  [1764401346.3563] device (tap9083d4b6-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.376 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:06Z|00650|binding|INFO|Releasing lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 from this chassis (sb_readonly=0)
Nov 29 02:29:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:06Z|00651|binding|INFO|Setting lport 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 down in Southbound
Nov 29 02:29:06 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:06Z|00652|binding|INFO|Removing iface tap9083d4b6-b3 ovn-installed in OVS
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.385 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:93:e0 10.100.0.7'], port_security=['fa:16:3e:19:93:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '97b39e54-312a-4ebc-863b-e1ef5f4cf363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c231e63624d44fc19e0989abfb1afb22', 'neutron:revision_number': '6', 'neutron:security_group_ids': '99684fb2-cb2f-45bc-99fa-10934e68636b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=48e2bcd2-f544-4812-a73e-4d43e4ef323e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.386 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 in datapath 717f1c01-fb17-41f0-848c-cebdb3841bf9 unbound from our chassis#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.388 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 717f1c01-fb17-41f0-848c-cebdb3841bf9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.389 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[40c73206-3742-444c-942a-ccc3cfd9923f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.390 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 namespace which is not needed anymore#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.397 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000086.scope: Deactivated successfully.
Nov 29 02:29:06 np0005539505 systemd-machined[153285]: Machine qemu-73-instance-00000086 terminated.
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [NOTICE]   (241764) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [NOTICE]   (241764) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [WARNING]  (241764) : Exiting Master process...
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [WARNING]  (241764) : Exiting Master process...
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [ALERT]    (241764) : Current worker (241771) exited with code 143 (Terminated)
Nov 29 02:29:06 np0005539505 neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9[241759]: [WARNING]  (241764) : All workers exited. Exiting... (0)
Nov 29 02:29:06 np0005539505 systemd[1]: libpod-e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01.scope: Deactivated successfully.
Nov 29 02:29:06 np0005539505 podman[241911]: 2025-11-29 07:29:06.530440973 +0000 UTC m=+0.055245905 container died e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 29 02:29:06 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:06 np0005539505 systemd[1]: var-lib-containers-storage-overlay-c31a38509ce20379f44aaa6b7bc04156e4b012440e52f180231a029d689ed1e0-merged.mount: Deactivated successfully.
Nov 29 02:29:06 np0005539505 podman[241911]: 2025-11-29 07:29:06.603717526 +0000 UTC m=+0.128522448 container cleanup e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:29:06 np0005539505 systemd[1]: libpod-conmon-e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01.scope: Deactivated successfully.
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.633 186962 INFO nova.virt.libvirt.driver [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance destroyed successfully.#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.634 186962 DEBUG nova.objects.instance [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lazy-loading 'resources' on Instance uuid 97b39e54-312a-4ebc-863b-e1ef5f4cf363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.651 186962 DEBUG nova.virt.libvirt.vif [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-468452072',display_name='tempest-TestNetworkAdvancedServerOps-server-468452072',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-468452072',id=134,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNYkyrD6GH7Y6zXJE6UdUt8xCR/1va5hW1FscfbM7L30ylZqlz2D8TZeKR1ExqmSnTdiWL+UU0qKusuYO7HAthTySgqkMobIXogk9BOrKSRDnqKaAi1AWXXuuhtQyh3SVQ==',key_name='tempest-TestNetworkAdvancedServerOps-1220905380',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:28:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c231e63624d44fc19e0989abfb1afb22',ramdisk_id='',reservation_id='r-4no3w984',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1380683659',owner_user_name='tempest-TestNetworkAdvancedServerOps-1380683659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:02Z,user_data=None,user_id='bfd2024670594b10941cec8a59d2573f',uuid=97b39e54-312a-4ebc-863b-e1ef5f4cf363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.652 186962 DEBUG nova.network.os_vif_util [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converting VIF {"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.653 186962 DEBUG nova.network.os_vif_util [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.653 186962 DEBUG os_vif [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.655 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.655 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9083d4b6-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.657 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.664 186962 INFO os_vif [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:93:e0,bridge_name='br-int',has_traffic_filtering=True,id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5,network=Network(717f1c01-fb17-41f0-848c-cebdb3841bf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9083d4b6-b3')#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.665 186962 INFO nova.virt.libvirt.driver [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Deleting instance files /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363_del#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.667 186962 INFO nova.virt.libvirt.driver [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Deletion of /var/lib/nova/instances/97b39e54-312a-4ebc-863b-e1ef5f4cf363_del complete#033[00m
Nov 29 02:29:06 np0005539505 podman[241953]: 2025-11-29 07:29:06.681967751 +0000 UTC m=+0.049073290 container remove e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.688 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d99fb10e-1be7-42ae-b94b-696d61899f18]: (4, ('Sat Nov 29 07:29:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 (e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01)\ne02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01\nSat Nov 29 07:29:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 (e02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01)\ne02eba786c3fc8e81d0e53fe8e137c55046abc08d5536710439f24412896bf01\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.690 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[44fd1975-fa38-46f3-8055-0c76b800c688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.690 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap717f1c01-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.693 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 kernel: tap717f1c01-f0: left promiscuous mode
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.697 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c4c817f4-a381-4a72-b2f5-99d6fe9fee4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.711 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.716 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ce711c-6b7e-44a6-a5af-84a8d318784d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.718 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c86861-1106-4b3a-aef9-c83f4d69b934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.732 186962 DEBUG nova.compute.manager [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.732 186962 DEBUG oslo_concurrency.lockutils [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.733 186962 DEBUG oslo_concurrency.lockutils [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.733 186962 DEBUG oslo_concurrency.lockutils [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.733 186962 DEBUG nova.compute.manager [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.733 186962 DEBUG nova.compute.manager [req-ba0c0bb4-3bd3-4b26-b66e-bdad56c362ee req-fdfd1b48-2b99-4400-be7d-ff6a2de8c105 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-unplugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.735 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5356abb6-31fb-4da7-bc56-c17f9c4f4afa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678955, 'reachable_time': 40380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241967, 'error': None, 'target': 'ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 systemd[1]: run-netns-ovnmeta\x2d717f1c01\x2dfb17\x2d41f0\x2d848c\x2dcebdb3841bf9.mount: Deactivated successfully.
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.738 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-717f1c01-fb17-41f0-848c-cebdb3841bf9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:06.738 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[adab0f9c-ade8-4fc8-b0e5-bb4110284ce9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.741 186962 INFO nova.compute.manager [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.742 186962 DEBUG oslo.service.loopingcall [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.742 186962 DEBUG nova.compute.manager [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:06 np0005539505 nova_compute[186958]: 2025-11-29 07:29:06.742 186962 DEBUG nova.network.neutron [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.465 186962 DEBUG nova.network.neutron [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.491 186962 INFO nova.compute.manager [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Took 0.75 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.645 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.645 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.707 186962 DEBUG nova.network.neutron [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updated VIF entry in instance network info cache for port 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.708 186962 DEBUG nova.network.neutron [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Updating instance_info_cache with network_info: [{"id": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "address": "fa:16:3e:19:93:e0", "network": {"id": "717f1c01-fb17-41f0-848c-cebdb3841bf9", "bridge": "br-int", "label": "tempest-network-smoke--1973334957", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c231e63624d44fc19e0989abfb1afb22", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9083d4b6-b3", "ovs_interfaceid": "9083d4b6-b3e2-451d-b686-a9cab7e5a2f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.718 186962 DEBUG nova.compute.provider_tree [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.729 186962 DEBUG oslo_concurrency.lockutils [req-711ef02b-7c89-4e88-b892-5c355b01191c req-a78f49f9-feb5-4dde-9004-311faa1bdcfd 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-97b39e54-312a-4ebc-863b-e1ef5f4cf363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.734 186962 DEBUG nova.scheduler.client.report [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.750 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.768 186962 INFO nova.scheduler.client.report [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Deleted allocations for instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363#033[00m
Nov 29 02:29:07 np0005539505 nova_compute[186958]: 2025-11-29 07:29:07.839 186962 DEBUG oslo_concurrency.lockutils [None req-971fec1a-7f6f-4f7e-9c89-e5adc0ea241a bfd2024670594b10941cec8a59d2573f c231e63624d44fc19e0989abfb1afb22 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.414 186962 DEBUG nova.compute.manager [req-bd50ad25-dfc8-4fa7-9483-7c408cc658a4 req-c2bd5908-f705-450f-940d-c52501fceb76 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-deleted-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.415 186962 INFO nova.compute.manager [req-bd50ad25-dfc8-4fa7-9483-7c408cc658a4 req-c2bd5908-f705-450f-940d-c52501fceb76 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Neutron deleted interface 9083d4b6-b3e2-451d-b686-a9cab7e5a2f5; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.415 186962 DEBUG nova.network.neutron [req-bd50ad25-dfc8-4fa7-9483-7c408cc658a4 req-c2bd5908-f705-450f-940d-c52501fceb76 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.416 186962 DEBUG nova.compute.manager [req-bd50ad25-dfc8-4fa7-9483-7c408cc658a4 req-c2bd5908-f705-450f-940d-c52501fceb76 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Detach interface failed, port_id=9083d4b6-b3e2-451d-b686-a9cab7e5a2f5, reason: Instance 97b39e54-312a-4ebc-863b-e1ef5f4cf363 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.918 186962 DEBUG nova.compute.manager [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.919 186962 DEBUG oslo_concurrency.lockutils [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.920 186962 DEBUG oslo_concurrency.lockutils [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.920 186962 DEBUG oslo_concurrency.lockutils [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "97b39e54-312a-4ebc-863b-e1ef5f4cf363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.921 186962 DEBUG nova.compute.manager [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] No waiting events found dispatching network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:08 np0005539505 nova_compute[186958]: 2025-11-29 07:29:08.922 186962 WARNING nova.compute.manager [req-49d969d5-e34d-4ca2-8cef-499973859557 req-b938dd64-1112-47a3-8274-77bf132b9d80 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Received unexpected event network-vif-plugged-9083d4b6-b3e2-451d-b686-a9cab7e5a2f5 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:09 np0005539505 podman[241968]: 2025-11-29 07:29:09.74070592 +0000 UTC m=+0.067198673 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:29:09 np0005539505 podman[241969]: 2025-11-29 07:29:09.753142782 +0000 UTC m=+0.073764629 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.252 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.253 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.254 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.254 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.255 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.274 186962 INFO nova.compute.manager [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Terminating instance#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.307 186962 DEBUG nova.compute.manager [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:10 np0005539505 kernel: tap0273a9e9-32 (unregistering): left promiscuous mode
Nov 29 02:29:10 np0005539505 NetworkManager[55134]: <info>  [1764401350.3309] device (tap0273a9e9-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.337 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:10Z|00653|binding|INFO|Releasing lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c from this chassis (sb_readonly=0)
Nov 29 02:29:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:10Z|00654|binding|INFO|Setting lport 0273a9e9-32f2-4363-a3b4-aa1a87caf07c down in Southbound
Nov 29 02:29:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:10Z|00655|binding|INFO|Removing iface tap0273a9e9-32 ovn-installed in OVS
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.340 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.347 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:ab:ce 10.100.0.8'], port_security=['fa:16:3e:7d:ab:ce 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '5487c798-eb5a-4186-9693-a64ecd64b296', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-008329a1-d4dc-4cfb-be68-95f658d9813d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5d1e4f74add34e9b9a2084bd9586db0c', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b04b69be-f431-4979-89c6-4e231888644a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e7e71bac-297c-4031-8579-254c834f5859, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=0273a9e9-32f2-4363-a3b4-aa1a87caf07c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.348 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 0273a9e9-32f2-4363-a3b4-aa1a87caf07c in datapath 008329a1-d4dc-4cfb-be68-95f658d9813d unbound from our chassis#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.350 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 008329a1-d4dc-4cfb-be68-95f658d9813d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.351 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82c1ffee-5f83-4204-800a-15e80f6293a2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.352 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d namespace which is not needed anymore#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000089.scope: Deactivated successfully.
Nov 29 02:29:10 np0005539505 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000089.scope: Consumed 9.414s CPU time.
Nov 29 02:29:10 np0005539505 systemd-machined[153285]: Machine qemu-72-instance-00000089 terminated.
Nov 29 02:29:10 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [NOTICE]   (241664) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:10 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [NOTICE]   (241664) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:10 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [WARNING]  (241664) : Exiting Master process...
Nov 29 02:29:10 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [ALERT]    (241664) : Current worker (241671) exited with code 143 (Terminated)
Nov 29 02:29:10 np0005539505 neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d[241660]: [WARNING]  (241664) : All workers exited. Exiting... (0)
Nov 29 02:29:10 np0005539505 systemd[1]: libpod-fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45.scope: Deactivated successfully.
Nov 29 02:29:10 np0005539505 podman[242031]: 2025-11-29 07:29:10.493746392 +0000 UTC m=+0.048718300 container died fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:29:10 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:10 np0005539505 systemd[1]: var-lib-containers-storage-overlay-985a307ca8f7fe6921035a0598b6b43de18753c8260cd2790ab5d260ae3e7bde-merged.mount: Deactivated successfully.
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.535 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 podman[242031]: 2025-11-29 07:29:10.537369197 +0000 UTC m=+0.092341105 container cleanup fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 02:29:10 np0005539505 systemd[1]: libpod-conmon-fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45.scope: Deactivated successfully.
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.584 186962 INFO nova.virt.libvirt.driver [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Instance destroyed successfully.#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.584 186962 DEBUG nova.objects.instance [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lazy-loading 'resources' on Instance uuid 5487c798-eb5a-4186-9693-a64ecd64b296 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.598 186962 DEBUG nova.virt.libvirt.vif [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:28:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-400688363',display_name='tempest-ServerRescueNegativeTestJSON-server-400688363',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-400688363',id=137,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:29:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5d1e4f74add34e9b9a2084bd9586db0c',ramdisk_id='',reservation_id='r-1c7hw7ta',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-1892401049',owner_user_name='tempest-ServerRescueNegativeTestJSON-1892401049-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:02Z,user_data=None,user_id='4863fb992d4c48de9a92f63ffb1174a8',uuid=5487c798-eb5a-4186-9693-a64ecd64b296,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.598 186962 DEBUG nova.network.os_vif_util [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converting VIF {"id": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "address": "fa:16:3e:7d:ab:ce", "network": {"id": "008329a1-d4dc-4cfb-be68-95f658d9813d", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1895526191-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5d1e4f74add34e9b9a2084bd9586db0c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0273a9e9-32", "ovs_interfaceid": "0273a9e9-32f2-4363-a3b4-aa1a87caf07c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.599 186962 DEBUG nova.network.os_vif_util [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.599 186962 DEBUG os_vif [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.601 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0273a9e9-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.602 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.605 186962 INFO os_vif [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:ab:ce,bridge_name='br-int',has_traffic_filtering=True,id=0273a9e9-32f2-4363-a3b4-aa1a87caf07c,network=Network(008329a1-d4dc-4cfb-be68-95f658d9813d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0273a9e9-32')#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.605 186962 INFO nova.virt.libvirt.driver [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Deleting instance files /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296_del#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.606 186962 INFO nova.virt.libvirt.driver [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Deletion of /var/lib/nova/instances/5487c798-eb5a-4186-9693-a64ecd64b296_del complete#033[00m
Nov 29 02:29:10 np0005539505 podman[242073]: 2025-11-29 07:29:10.617536716 +0000 UTC m=+0.050630694 container remove fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.622 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3fceccda-df51-435c-97b5-aa8b5d34bd43]: (4, ('Sat Nov 29 07:29:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45)\nfa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45\nSat Nov 29 07:29:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d (fa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45)\nfa7adf622e5a8e4d1efbcccb58f5d4862550d9f6cbbba0b0a5339e6d40706f45\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.624 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[076bc606-efbb-4d61-8972-ad6f69e44298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.625 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap008329a1-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.627 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 kernel: tap008329a1-d0: left promiscuous mode
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.631 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6be7547b-3f7c-4d99-9780-3e188e31f84f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.657 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[798a4e3d-6761-407f-b8ee-da5addc2875c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.658 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1119ea0d-0803-400c-927a-34518157a550]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.676 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b135e4b0-2734-40b2-baa1-2c7078af7422]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678866, 'reachable_time': 38101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242095, 'error': None, 'target': 'ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.679 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-008329a1-d4dc-4cfb-be68-95f658d9813d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:10 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:10.679 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[96d4b553-4818-4bf9-896b-44daad6cea5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:10 np0005539505 systemd[1]: run-netns-ovnmeta\x2d008329a1\x2dd4dc\x2d4cfb\x2dbe68\x2d95f658d9813d.mount: Deactivated successfully.
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.703 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.709 186962 INFO nova.compute.manager [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.710 186962 DEBUG oslo.service.loopingcall [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.710 186962 DEBUG nova.compute.manager [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.711 186962 DEBUG nova.network.neutron [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.818 186962 DEBUG nova.compute.manager [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.819 186962 DEBUG oslo_concurrency.lockutils [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.819 186962 DEBUG oslo_concurrency.lockutils [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.819 186962 DEBUG oslo_concurrency.lockutils [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.819 186962 DEBUG nova.compute.manager [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:10 np0005539505 nova_compute[186958]: 2025-11-29 07:29:10.820 186962 DEBUG nova.compute.manager [req-000301c2-3e0b-44c0-addb-c5e4ce32aa0d req-931dcbd7-0e24-4c87-b23b-3c7da26c836a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-unplugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.558 186962 DEBUG nova.network.neutron [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.645 186962 INFO nova.compute.manager [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Took 0.93 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.675 186962 DEBUG nova.compute.manager [req-2bbf7def-4fc2-4b92-82e1-3046f6f7c5f8 req-141cda87-fa45-4cd1-94ed-6208666aaa97 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-deleted-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.747 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.748 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.924 186962 DEBUG nova.compute.provider_tree [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.927 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:11 np0005539505 nova_compute[186958]: 2025-11-29 07:29:11.961 186962 DEBUG nova.scheduler.client.report [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.008 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.043 186962 INFO nova.scheduler.client.report [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Deleted allocations for instance 5487c798-eb5a-4186-9693-a64ecd64b296#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.107 186962 DEBUG oslo_concurrency.lockutils [None req-e4393ece-ee37-4441-9c66-c136c86b3244 4863fb992d4c48de9a92f63ffb1174a8 5d1e4f74add34e9b9a2084bd9586db0c - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.130 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.944 186962 DEBUG nova.compute.manager [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.944 186962 DEBUG oslo_concurrency.lockutils [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.945 186962 DEBUG oslo_concurrency.lockutils [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.945 186962 DEBUG oslo_concurrency.lockutils [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "5487c798-eb5a-4186-9693-a64ecd64b296-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.945 186962 DEBUG nova.compute.manager [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] No waiting events found dispatching network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:12 np0005539505 nova_compute[186958]: 2025-11-29 07:29:12.945 186962 WARNING nova.compute.manager [req-15fd2d8b-c328-4480-8130-30aead9cfca6 req-caa9c7d9-9543-433e-a869-20b12749e178 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Received unexpected event network-vif-plugged-0273a9e9-32f2-4363-a3b4-aa1a87caf07c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:13 np0005539505 nova_compute[186958]: 2025-11-29 07:29:13.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.404 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.405 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.471 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.594 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.594 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.600 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.600 186962 INFO nova.compute.claims [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.751 186962 DEBUG nova.compute.provider_tree [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.767 186962 DEBUG nova.scheduler.client.report [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.790 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.791 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.879 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.880 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.916 186962 INFO nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:29:14 np0005539505 nova_compute[186958]: 2025-11-29 07:29:14.960 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.140 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.142 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.142 186962 INFO nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Creating image(s)#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.143 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.143 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.144 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.157 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.215 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.216 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.217 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.228 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.263 186962 DEBUG nova.policy [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.284 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.285 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.320 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.321 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.322 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.374 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.375 186962 DEBUG nova.virt.disk.api [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.376 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.432 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.433 186962 DEBUG nova.virt.disk.api [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.433 186962 DEBUG nova.objects.instance [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid e7c569c9-0fdc-4213-b9d1-779719d43c2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.479 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.479 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Ensure instance console log exists: /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.480 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.480 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.480 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:15 np0005539505 nova_compute[186958]: 2025-11-29 07:29:15.702 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:16 np0005539505 nova_compute[186958]: 2025-11-29 07:29:16.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:17 np0005539505 nova_compute[186958]: 2025-11-29 07:29:17.137 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Successfully created port: d68a4c07-f2f7-4755-9775-326f6b440bd1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:29:17 np0005539505 nova_compute[186958]: 2025-11-29 07:29:17.290 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401342.2885182, 19c7d8b2-3f1a-40cf-a538-dd2752970ffb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:17 np0005539505 nova_compute[186958]: 2025-11-29 07:29:17.290 186962 INFO nova.compute.manager [-] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:17 np0005539505 nova_compute[186958]: 2025-11-29 07:29:17.309 186962 DEBUG nova.compute.manager [None req-fd15d304-b564-40fe-a458-4eb566780f54 - - - - - -] [instance: 19c7d8b2-3f1a-40cf-a538-dd2752970ffb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:18 np0005539505 nova_compute[186958]: 2025-11-29 07:29:18.148 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Successfully created port: a96afdaa-e76f-4d5d-a3e3-c94e58cce61b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.652 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Successfully updated port: d68a4c07-f2f7-4755-9775-326f6b440bd1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.787 186962 DEBUG nova.compute.manager [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.787 186962 DEBUG nova.compute.manager [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing instance network info cache due to event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.788 186962 DEBUG oslo_concurrency.lockutils [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.788 186962 DEBUG oslo_concurrency.lockutils [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.788 186962 DEBUG nova.network.neutron [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing network info cache for port d68a4c07-f2f7-4755-9775-326f6b440bd1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:19 np0005539505 nova_compute[186958]: 2025-11-29 07:29:19.974 186962 DEBUG nova.network.neutron [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:29:20 np0005539505 nova_compute[186958]: 2025-11-29 07:29:20.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:20 np0005539505 nova_compute[186958]: 2025-11-29 07:29:20.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:29:20 np0005539505 nova_compute[186958]: 2025-11-29 07:29:20.460 186962 DEBUG nova.network.neutron [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:20 np0005539505 nova_compute[186958]: 2025-11-29 07:29:20.610 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:20 np0005539505 nova_compute[186958]: 2025-11-29 07:29:20.706 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:21 np0005539505 nova_compute[186958]: 2025-11-29 07:29:21.629 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401346.6276956, 97b39e54-312a-4ebc-863b-e1ef5f4cf363 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:21 np0005539505 nova_compute[186958]: 2025-11-29 07:29:21.630 186962 INFO nova.compute.manager [-] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:21 np0005539505 nova_compute[186958]: 2025-11-29 07:29:21.919 186962 DEBUG nova.compute.manager [None req-a1be0771-e18a-43bb-943a-dc58785d27a3 - - - - - -] [instance: 97b39e54-312a-4ebc-863b-e1ef5f4cf363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:21 np0005539505 nova_compute[186958]: 2025-11-29 07:29:21.920 186962 DEBUG oslo_concurrency.lockutils [req-82061504-8052-40da-a465-a04ce7f4e854 req-e1482020-d415-4027-9787-2df598964399 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:22 np0005539505 nova_compute[186958]: 2025-11-29 07:29:22.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:23 np0005539505 podman[242114]: 2025-11-29 07:29:23.737387906 +0000 UTC m=+0.059075073 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:29:23 np0005539505 podman[242113]: 2025-11-29 07:29:23.737682484 +0000 UTC m=+0.063676383 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.746 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Successfully updated port: a96afdaa-e76f-4d5d-a3e3-c94e58cce61b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.768 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.768 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.769 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.831 186962 DEBUG nova.compute.manager [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-changed-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.832 186962 DEBUG nova.compute.manager [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing instance network info cache due to event network-changed-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:24 np0005539505 nova_compute[186958]: 2025-11-29 07:29:24.832 186962 DEBUG oslo_concurrency.lockutils [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.011 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.410 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.410 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.436 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.437 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.437 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.437 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.576 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.577 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5588MB free_disk=73.07365036010742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.577 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.577 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.582 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401350.5815063, 5487c798-eb5a-4186-9693-a64ecd64b296 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.582 186962 INFO nova.compute.manager [-] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.613 186962 DEBUG nova.compute.manager [None req-8f94e8dc-0f06-4c2d-a600-cbcc9bb81c68 - - - - - -] [instance: 5487c798-eb5a-4186-9693-a64ecd64b296] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.841 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance e7c569c9-0fdc-4213-b9d1-779719d43c2f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.842 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.842 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.858 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.881 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.881 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.897 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.927 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:29:25 np0005539505 nova_compute[186958]: 2025-11-29 07:29:25.982 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:26 np0005539505 nova_compute[186958]: 2025-11-29 07:29:26.001 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:26 np0005539505 nova_compute[186958]: 2025-11-29 07:29:26.042 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:29:26 np0005539505 nova_compute[186958]: 2025-11-29 07:29:26.043 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:26 np0005539505 podman[242154]: 2025-11-29 07:29:26.730161998 +0000 UTC m=+0.055507012 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:29:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:27.508 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:27.509 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:27.510 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.844 186962 DEBUG nova.network.neutron [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": 
"2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.885 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.886 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance network_info: |[{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.886 186962 DEBUG oslo_concurrency.lockutils [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.886 186962 DEBUG nova.network.neutron [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing network info cache for port a96afdaa-e76f-4d5d-a3e3-c94e58cce61b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.890 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Start _get_guest_xml network_info=[{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.895 186962 WARNING nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.901 186962 DEBUG nova.virt.libvirt.host [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.901 186962 DEBUG nova.virt.libvirt.host [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.908 186962 DEBUG nova.virt.libvirt.host [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.909 186962 DEBUG nova.virt.libvirt.host [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.911 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.911 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.912 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.912 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.913 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.913 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.913 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.914 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.914 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.915 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.915 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.915 186962 DEBUG nova.virt.hardware [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.921 186962 DEBUG nova.virt.libvirt.vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:15Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.922 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.923 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.924 186962 DEBUG nova.virt.libvirt.vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:15Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.924 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.925 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.927 186962 DEBUG nova.objects.instance [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid e7c569c9-0fdc-4213-b9d1-779719d43c2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.951 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <uuid>e7c569c9-0fdc-4213-b9d1-779719d43c2f</uuid>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <name>instance-0000008a</name>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestGettingAddress-server-664402122</nova:name>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:29:27</nova:creationTime>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:port uuid="d68a4c07-f2f7-4755-9775-326f6b440bd1">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        <nova:port uuid="a96afdaa-e76f-4d5d-a3e3-c94e58cce61b">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe9f:67d6" ipVersion="6"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe9f:67d6" ipVersion="6"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="serial">e7c569c9-0fdc-4213-b9d1-779719d43c2f</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="uuid">e7c569c9-0fdc-4213-b9d1-779719d43c2f</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.config"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:01:57:5e"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <target dev="tapd68a4c07-f2"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:9f:67:d6"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <target dev="tapa96afdaa-e7"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/console.log" append="off"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:29:27 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:29:27 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:29:27 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:29:27 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.953 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Preparing to wait for external event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.954 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.954 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.954 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.954 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Preparing to wait for external event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.954 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.955 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.955 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.955 186962 DEBUG nova.virt.libvirt.vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:15Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.956 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.956 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.956 186962 DEBUG os_vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.957 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.957 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.958 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.960 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.960 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd68a4c07-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.960 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd68a4c07-f2, col_values=(('external_ids', {'iface-id': 'd68a4c07-f2f7-4755-9775-326f6b440bd1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:57:5e', 'vm-uuid': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 NetworkManager[55134]: <info>  [1764401367.9628] manager: (tapd68a4c07-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.964 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.967 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.968 186962 INFO os_vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2')#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.968 186962 DEBUG nova.virt.libvirt.vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:29:15Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.969 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.969 186962 DEBUG nova.network.os_vif_util [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.970 186962 DEBUG os_vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.970 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.970 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.970 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.972 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.972 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa96afdaa-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.972 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa96afdaa-e7, col_values=(('external_ids', {'iface-id': 'a96afdaa-e76f-4d5d-a3e3-c94e58cce61b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:67:d6', 'vm-uuid': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 NetworkManager[55134]: <info>  [1764401367.9744] manager: (tapa96afdaa-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.979 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:27 np0005539505 nova_compute[186958]: 2025-11-29 07:29:27.980 186962 INFO os_vif [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7')#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.034 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.035 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.035 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:01:57:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.035 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:9f:67:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.036 186962 INFO nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Using config drive#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.037 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.827 186962 INFO nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Creating config drive at /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.config#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.836 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptur04sax execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:29:28 np0005539505 nova_compute[186958]: 2025-11-29 07:29:28.964 186962 DEBUG oslo_concurrency.processutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptur04sax" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:29:29 np0005539505 kernel: tapd68a4c07-f2: entered promiscuous mode
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0354] manager: (tapd68a4c07-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00656|binding|INFO|Claiming lport d68a4c07-f2f7-4755-9775-326f6b440bd1 for this chassis.
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00657|binding|INFO|d68a4c07-f2f7-4755-9775-326f6b440bd1: Claiming fa:16:3e:01:57:5e 10.100.0.7
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 kernel: tapa96afdaa-e7: entered promiscuous mode
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0607] manager: (tapa96afdaa-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/322)
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.062 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 systemd-udevd[242199]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:29 np0005539505 systemd-udevd[242200]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.076 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0774] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0797] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00658|if_status|INFO|Dropped 4 log messages in last 746 seconds (most recently, 746 seconds ago) due to excessive rate
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00659|if_status|INFO|Not updating pb chassis for a96afdaa-e76f-4d5d-a3e3-c94e58cce61b now as sb is readonly
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0837] device (tapd68a4c07-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0860] device (tapa96afdaa-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0881] device (tapd68a4c07-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.0892] device (tapa96afdaa-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:29:29 np0005539505 systemd-machined[153285]: New machine qemu-74-instance-0000008a.
Nov 29 02:29:29 np0005539505 systemd[1]: Started Virtual Machine qemu-74-instance-0000008a.
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.245 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.257 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.549 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401369.548491, e7c569c9-0fdc-4213-b9d1-779719d43c2f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.549 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] VM Started (Lifecycle Event)#033[00m
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00660|binding|INFO|Claiming lport a96afdaa-e76f-4d5d-a3e3-c94e58cce61b for this chassis.
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00661|binding|INFO|a96afdaa-e76f-4d5d-a3e3-c94e58cce61b: Claiming fa:16:3e:9f:67:d6 2001:db8:0:1:f816:3eff:fe9f:67d6 2001:db8::f816:3eff:fe9f:67d6
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.716 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00662|binding|INFO|Setting lport d68a4c07-f2f7-4755-9775-326f6b440bd1 ovn-installed in OVS
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00663|binding|INFO|Setting lport d68a4c07-f2f7-4755-9775-326f6b440bd1 up in Southbound
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.716 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:57:5e 10.100.0.7'], port_security=['fa:16:3e:01:57:5e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94472368-b72a-4e5d-ac59-40b24b7ba792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c99c04c3-6b8c-480e-be26-e44e383928c7, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d68a4c07-f2f7-4755-9775-326f6b440bd1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.717 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d68a4c07-f2f7-4755-9775-326f6b440bd1 in datapath 94472368-b72a-4e5d-ac59-40b24b7ba792 bound to our chassis#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.718 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.719 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94472368-b72a-4e5d-ac59-40b24b7ba792#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.722 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401369.5486655, e7c569c9-0fdc-4213-b9d1-779719d43c2f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.723 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.729 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a167a7-eb29-461a-97b9-618b1eb0ca83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.730 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94472368-b1 in ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.732 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94472368-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.732 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[697c8012-7f7d-4495-8806-f3f74b058806]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.735 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:67:d6 2001:db8:0:1:f816:3eff:fe9f:67d6 2001:db8::f816:3eff:fe9f:67d6'], port_security=['fa:16:3e:9f:67:d6 2001:db8:0:1:f816:3eff:fe9f:67d6 2001:db8::f816:3eff:fe9f:67d6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe9f:67d6/64 2001:db8::f816:3eff:fe9f:67d6/64', 'neutron:device_id': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.738 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cc84dabd-9eb5-435c-b0df-0af7b1c8a1dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00664|binding|INFO|Setting lport a96afdaa-e76f-4d5d-a3e3-c94e58cce61b ovn-installed in OVS
Nov 29 02:29:29 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:29Z|00665|binding|INFO|Setting lport a96afdaa-e76f-4d5d-a3e3-c94e58cce61b up in Southbound
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.749 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.753 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8cfc5dcb-4525-4d15-9b53-79e457d3369e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.754 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.757 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.767 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[be4ad71c-aa5f-4110-99a6-1bc8224f1da4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 nova_compute[186958]: 2025-11-29 07:29:29.778 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.801 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[954d6b71-b650-4e2c-be90-1be484e98e9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.8073] manager: (tap94472368-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/325)
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.807 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c40f6c-525b-435e-9a2d-8fc37860d8c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.834 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[eb32d17d-d4ba-4244-9f25-f6e6d0d98c65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.837 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e89e66cc-7f63-4082-bb8e-f92ffb278c2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 NetworkManager[55134]: <info>  [1764401369.8598] device (tap94472368-b0): carrier: link connected
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.864 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6355bf-e9a7-40e2-b23e-0880cce9c8a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.882 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e94cb444-a5f2-4cd4-b881-267e0c9674c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94472368-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:43:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681744, 'reachable_time': 27104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242245, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.897 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[238f9620-e575-4ea5-8f35-2df3e52dcfe8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:4356'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 681744, 'tstamp': 681744}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242246, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.911 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27ba851c-83bf-43ae-9ee4-6e63a8b4d374]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94472368-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:43:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681744, 'reachable_time': 27104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242247, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:29.942 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[389c491d-e4c8-4643-ae4f-2318f72909d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.000 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4172e0-5d02-452b-84bf-cebf5f87ac9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.001 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94472368-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.002 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.002 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94472368-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 NetworkManager[55134]: <info>  [1764401370.0047] manager: (tap94472368-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.004 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 kernel: tap94472368-b0: entered promiscuous mode
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.007 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94472368-b0, col_values=(('external_ids', {'iface-id': '9d125548-8068-4815-941c-f4536091ef07'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:30Z|00666|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.031 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.032 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9dfab8af-9c0c-4295-a75b-ca16ef85d7c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.032 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-94472368-b72a-4e5d-ac59-40b24b7ba792
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/94472368-b72a-4e5d-ac59-40b24b7ba792.pid.haproxy
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 94472368-b72a-4e5d-ac59-40b24b7ba792
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.033 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'env', 'PROCESS_TAG=haproxy-94472368-b72a-4e5d-ac59-40b24b7ba792', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94472368-b72a-4e5d-ac59-40b24b7ba792.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.088 186962 DEBUG nova.compute.manager [req-11ef276c-b87c-4635-ac6b-9466622c0c62 req-3e1db09a-4922-45e6-9612-9c3636d85fc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.089 186962 DEBUG oslo_concurrency.lockutils [req-11ef276c-b87c-4635-ac6b-9466622c0c62 req-3e1db09a-4922-45e6-9612-9c3636d85fc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.089 186962 DEBUG oslo_concurrency.lockutils [req-11ef276c-b87c-4635-ac6b-9466622c0c62 req-3e1db09a-4922-45e6-9612-9c3636d85fc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.090 186962 DEBUG oslo_concurrency.lockutils [req-11ef276c-b87c-4635-ac6b-9466622c0c62 req-3e1db09a-4922-45e6-9612-9c3636d85fc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.090 186962 DEBUG nova.compute.manager [req-11ef276c-b87c-4635-ac6b-9466622c0c62 req-3e1db09a-4922-45e6-9612-9c3636d85fc1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Processing event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.218 186962 DEBUG nova.compute.manager [req-b452ce38-1622-4b14-8bea-3a9167806adc req-ed53d1c6-fb22-44e5-8452-cc6fe472f9a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.219 186962 DEBUG oslo_concurrency.lockutils [req-b452ce38-1622-4b14-8bea-3a9167806adc req-ed53d1c6-fb22-44e5-8452-cc6fe472f9a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.219 186962 DEBUG oslo_concurrency.lockutils [req-b452ce38-1622-4b14-8bea-3a9167806adc req-ed53d1c6-fb22-44e5-8452-cc6fe472f9a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.220 186962 DEBUG oslo_concurrency.lockutils [req-b452ce38-1622-4b14-8bea-3a9167806adc req-ed53d1c6-fb22-44e5-8452-cc6fe472f9a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.220 186962 DEBUG nova.compute.manager [req-b452ce38-1622-4b14-8bea-3a9167806adc req-ed53d1c6-fb22-44e5-8452-cc6fe472f9a7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Processing event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.221 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.230 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401370.2298412, e7c569c9-0fdc-4213-b9d1-779719d43c2f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.230 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.243 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.250 186962 INFO nova.virt.libvirt.driver [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance spawned successfully.#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.251 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.254 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.258 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.276 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.280 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.280 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.281 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.281 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.282 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.282 186962 DEBUG nova.virt.libvirt.driver [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.382 186962 INFO nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Took 15.24 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.383 186962 DEBUG nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:29:30 np0005539505 podman[242277]: 2025-11-29 07:29:30.404941841 +0000 UTC m=+0.053605398 container create c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:29:30 np0005539505 systemd[1]: Started libpod-conmon-c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87.scope.
Nov 29 02:29:30 np0005539505 podman[242277]: 2025-11-29 07:29:30.371720311 +0000 UTC m=+0.020383878 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:30 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:29:30 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b16bc3e482e4460309376a646882f55bf4a770c2d9569891a9a3b622202d24c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.495 186962 INFO nova.compute.manager [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Took 15.96 seconds to build instance.#033[00m
Nov 29 02:29:30 np0005539505 podman[242277]: 2025-11-29 07:29:30.502724549 +0000 UTC m=+0.151388126 container init c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:29:30 np0005539505 podman[242277]: 2025-11-29 07:29:30.507775222 +0000 UTC m=+0.156438779 container start c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.514 186962 DEBUG oslo_concurrency.lockutils [None req-d074e71f-787d-469c-90c0-fd48f7cc0228 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:30 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [NOTICE]   (242298) : New worker (242300) forked
Nov 29 02:29:30 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [NOTICE]   (242298) : Loading success.
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.598 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a96afdaa-e76f-4d5d-a3e3-c94e58cce61b in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e unbound from our chassis#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.600 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.610 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6a26a9fb-4ed9-4cd2-abb8-737419f19479]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.611 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff387e90-41 in ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.613 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff387e90-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.613 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0b202b74-d723-41e8-ad1a-f665d4505411]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.614 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[def4a90e-c530-4556-9b03-879e7934334f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.627 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[1691171d-f43d-46b5-94fd-cfda46199bbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.642 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[57c98ad1-d18f-4140-b996-7a6e2866abe7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.670 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[40b4f9e9-615a-4b42-8ec3-eac1fc7acebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.677 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[13ba5a8d-5ec5-4917-b374-79381aa0f69d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 systemd-udevd[242237]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:29:30 np0005539505 NetworkManager[55134]: <info>  [1764401370.6789] manager: (tapff387e90-40): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.708 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.713 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[12e8cd01-4bbe-4084-be53-33a817986e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.716 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[def0af87-3481-4990-8f08-178b8bb64fcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 NetworkManager[55134]: <info>  [1764401370.7423] device (tapff387e90-40): carrier: link connected
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.747 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a84abf-4404-442f-bf4f-0c57f54fe508]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.764 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ece40222-872a-467c-8c2d-e17ec2138236]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff387e90-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:46:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681832, 'reachable_time': 27380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242319, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.781 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc1fc30-fcb6-47df-ac3a-2dfb76bce73b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:4633'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 681832, 'tstamp': 681832}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242320, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.801 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd22f458-9dd0-4856-8c6a-efcbb8ab748a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff387e90-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:46:33'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 211], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681832, 'reachable_time': 27380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242321, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.808 186962 DEBUG nova.network.neutron [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updated VIF entry in instance network info cache for port a96afdaa-e76f-4d5d-a3e3-c94e58cce61b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.809 186962 DEBUG nova.network.neutron [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.846 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[18a74633-2632-4aae-911c-9afd34da3323]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.885 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[74550352-f649-41e5-8d1e-706f319dce12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.887 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff387e90-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.887 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.888 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff387e90-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 NetworkManager[55134]: <info>  [1764401370.8907] manager: (tapff387e90-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Nov 29 02:29:30 np0005539505 kernel: tapff387e90-40: entered promiscuous mode
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.895 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff387e90-40, col_values=(('external_ids', {'iface-id': 'f0ce4da0-40ec-44ef-8179-4cbfad9b57f1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.896 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:30Z|00667|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.897 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.899 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5c8a93-6a6f-4a63-91ae-2c85bb10178e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.900 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-ff387e90-45c2-42d7-b536-fee4d2b6eb5e
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.pid.haproxy
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID ff387e90-45c2-42d7-b536-fee4d2b6eb5e
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:29:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:30.901 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'env', 'PROCESS_TAG=haproxy-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff387e90-45c2-42d7-b536-fee4d2b6eb5e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:29:30 np0005539505 nova_compute[186958]: 2025-11-29 07:29:30.908 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:31 np0005539505 podman[242351]: 2025-11-29 07:29:31.218022302 +0000 UTC m=+0.024131334 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:29:31 np0005539505 nova_compute[186958]: 2025-11-29 07:29:31.484 186962 DEBUG oslo_concurrency.lockutils [req-9dd4f1ed-7363-411b-8abd-9d676d31cb27 req-bd2b323f-08ae-4930-b2da-64d8012a7692 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:31 np0005539505 podman[242351]: 2025-11-29 07:29:31.713188227 +0000 UTC m=+0.519297279 container create bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:29:32 np0005539505 systemd[1]: Started libpod-conmon-bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5.scope.
Nov 29 02:29:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:29:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cafcb4864c9adbfc3baff8c08ff5d2a4fffd05524ac7819a46c9843e63efcb51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:29:32 np0005539505 podman[242351]: 2025-11-29 07:29:32.196948538 +0000 UTC m=+1.003057550 container init bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:29:32 np0005539505 podman[242351]: 2025-11-29 07:29:32.207384293 +0000 UTC m=+1.013493305 container start bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:29:32 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [NOTICE]   (242370) : New worker (242372) forked
Nov 29 02:29:32 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [NOTICE]   (242370) : Loading success.
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.481 186962 DEBUG nova.compute.manager [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.482 186962 DEBUG oslo_concurrency.lockutils [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.482 186962 DEBUG oslo_concurrency.lockutils [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.483 186962 DEBUG oslo_concurrency.lockutils [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.484 186962 DEBUG nova.compute.manager [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.485 186962 WARNING nova.compute.manager [req-fe22b8a5-c2e8-4078-968c-ee207c24eb54 req-4399bf48-3600-486f-b6fc-dcacb3c8faa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received unexpected event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.652 186962 DEBUG nova.compute.manager [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.653 186962 DEBUG oslo_concurrency.lockutils [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.653 186962 DEBUG oslo_concurrency.lockutils [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.653 186962 DEBUG oslo_concurrency.lockutils [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.653 186962 DEBUG nova.compute.manager [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.654 186962 WARNING nova.compute.manager [req-89bebbbb-5602-4981-8416-22607955debe req-37d4cc08-441c-4fe9-b981-8bcf19fdb497 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received unexpected event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b for instance with vm_state active and task_state None.#033[00m
Nov 29 02:29:32 np0005539505 nova_compute[186958]: 2025-11-29 07:29:32.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:33 np0005539505 nova_compute[186958]: 2025-11-29 07:29:33.001 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.326 186962 DEBUG nova.compute.manager [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.327 186962 DEBUG nova.compute.manager [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing instance network info cache due to event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.327 186962 DEBUG oslo_concurrency.lockutils [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.328 186962 DEBUG oslo_concurrency.lockutils [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.328 186962 DEBUG nova.network.neutron [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing network info cache for port d68a4c07-f2f7-4755-9775-326f6b440bd1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:29:35 np0005539505 nova_compute[186958]: 2025-11-29 07:29:35.721 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:36 np0005539505 podman[242381]: 2025-11-29 07:29:36.744138323 +0000 UTC m=+0.081364153 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:29:36 np0005539505 podman[242382]: 2025-11-29 07:29:36.752305644 +0000 UTC m=+0.085291725 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:37 np0005539505 nova_compute[186958]: 2025-11-29 07:29:37.264 186962 DEBUG nova.network.neutron [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updated VIF entry in instance network info cache for port d68a4c07-f2f7-4755-9775-326f6b440bd1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:37 np0005539505 nova_compute[186958]: 2025-11-29 07:29:37.266 186962 DEBUG nova.network.neutron [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:37 np0005539505 nova_compute[186958]: 2025-11-29 07:29:37.375 186962 DEBUG oslo_concurrency.lockutils [req-738843ab-652e-44e3-ae01-169b734e2e31 req-12f5cb13-248f-4056-bf22-f6c003ed804b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:37 np0005539505 nova_compute[186958]: 2025-11-29 07:29:37.978 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:40 np0005539505 nova_compute[186958]: 2025-11-29 07:29:40.724 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:40 np0005539505 podman[242431]: 2025-11-29 07:29:40.726920244 +0000 UTC m=+0.057497178 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125)
Nov 29 02:29:40 np0005539505 podman[242430]: 2025-11-29 07:29:40.758146808 +0000 UTC m=+0.088636709 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:29:43 np0005539505 nova_compute[186958]: 2025-11-29 07:29:43.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:43Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:57:5e 10.100.0.7
Nov 29 02:29:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:43Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:57:5e 10.100.0.7
Nov 29 02:29:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:45.297 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:45 np0005539505 nova_compute[186958]: 2025-11-29 07:29:45.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:45.298 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:29:45 np0005539505 nova_compute[186958]: 2025-11-29 07:29:45.726 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:47Z|00668|binding|INFO|Releasing lport 9d125548-8068-4815-941c-f4536091ef07 from this chassis (sb_readonly=0)
Nov 29 02:29:47 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:47Z|00669|binding|INFO|Releasing lport f0ce4da0-40ec-44ef-8179-4cbfad9b57f1 from this chassis (sb_readonly=0)
Nov 29 02:29:47 np0005539505 nova_compute[186958]: 2025-11-29 07:29:47.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:48 np0005539505 nova_compute[186958]: 2025-11-29 07:29:48.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:50 np0005539505 nova_compute[186958]: 2025-11-29 07:29:50.729 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:53 np0005539505 nova_compute[186958]: 2025-11-29 07:29:53.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:53.302 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.505 186962 DEBUG nova.compute.manager [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.506 186962 DEBUG nova.compute.manager [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing instance network info cache due to event network-changed-d68a4c07-f2f7-4755-9775-326f6b440bd1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.506 186962 DEBUG oslo_concurrency.lockutils [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.506 186962 DEBUG oslo_concurrency.lockutils [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.506 186962 DEBUG nova.network.neutron [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Refreshing network info cache for port d68a4c07-f2f7-4755-9775-326f6b440bd1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.595 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.596 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.596 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.596 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.597 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.610 186962 INFO nova.compute.manager [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Terminating instance#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.621 186962 DEBUG nova.compute.manager [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:29:54 np0005539505 kernel: tapd68a4c07-f2 (unregistering): left promiscuous mode
Nov 29 02:29:54 np0005539505 NetworkManager[55134]: <info>  [1764401394.6535] device (tapd68a4c07-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.656 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00670|binding|INFO|Releasing lport d68a4c07-f2f7-4755-9775-326f6b440bd1 from this chassis (sb_readonly=0)
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00671|binding|INFO|Setting lport d68a4c07-f2f7-4755-9775-326f6b440bd1 down in Southbound
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00672|binding|INFO|Removing iface tapd68a4c07-f2 ovn-installed in OVS
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.666 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:57:5e 10.100.0.7'], port_security=['fa:16:3e:01:57:5e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94472368-b72a-4e5d-ac59-40b24b7ba792', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c99c04c3-6b8c-480e-be26-e44e383928c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d68a4c07-f2f7-4755-9775-326f6b440bd1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.668 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d68a4c07-f2f7-4755-9775-326f6b440bd1 in datapath 94472368-b72a-4e5d-ac59-40b24b7ba792 unbound from our chassis#033[00m
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.669 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94472368-b72a-4e5d-ac59-40b24b7ba792, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:54 np0005539505 kernel: tapa96afdaa-e7 (unregistering): left promiscuous mode
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.671 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fca84709-f2a5-44fe-9153-56ef733b9978]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.672 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 namespace which is not needed anymore#033[00m
Nov 29 02:29:54 np0005539505 NetworkManager[55134]: <info>  [1764401394.6768] device (tapa96afdaa-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.680 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00673|binding|INFO|Releasing lport a96afdaa-e76f-4d5d-a3e3-c94e58cce61b from this chassis (sb_readonly=0)
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00674|binding|INFO|Setting lport a96afdaa-e76f-4d5d-a3e3-c94e58cce61b down in Southbound
Nov 29 02:29:54 np0005539505 ovn_controller[95143]: 2025-11-29T07:29:54Z|00675|binding|INFO|Removing iface tapa96afdaa-e7 ovn-installed in OVS
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.689 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.690 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:54.696 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:67:d6 2001:db8:0:1:f816:3eff:fe9f:67d6 2001:db8::f816:3eff:fe9f:67d6'], port_security=['fa:16:3e:9f:67:d6 2001:db8:0:1:f816:3eff:fe9f:67d6 2001:db8::f816:3eff:fe9f:67d6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe9f:67d6/64 2001:db8::f816:3eff:fe9f:67d6/64', 'neutron:device_id': 'e7c569c9-0fdc-4213-b9d1-779719d43c2f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '291ecd66-4825-4cfd-92f0-8370bd5efe48', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5ad144-c783-4b67-a226-e0c5588d3535, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.718 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000008a.scope: Deactivated successfully.
Nov 29 02:29:54 np0005539505 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000008a.scope: Consumed 13.145s CPU time.
Nov 29 02:29:54 np0005539505 systemd-machined[153285]: Machine qemu-74-instance-0000008a terminated.
Nov 29 02:29:54 np0005539505 podman[242491]: 2025-11-29 07:29:54.741652061 +0000 UTC m=+0.069571750 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:29:54 np0005539505 podman[242487]: 2025-11-29 07:29:54.742048182 +0000 UTC m=+0.072197034 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:29:54 np0005539505 NetworkManager[55134]: <info>  [1764401394.8518] manager: (tapa96afdaa-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.889 186962 INFO nova.virt.libvirt.driver [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Instance destroyed successfully.#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.890 186962 DEBUG nova.objects.instance [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid e7c569c9-0fdc-4213-b9d1-779719d43c2f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.912 186962 DEBUG nova.virt.libvirt.vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:29:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:30Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.912 186962 DEBUG nova.network.os_vif_util [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.913 186962 DEBUG nova.network.os_vif_util [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.913 186962 DEBUG os_vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.915 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd68a4c07-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.919 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.920 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.923 186962 INFO os_vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:57:5e,bridge_name='br-int',has_traffic_filtering=True,id=d68a4c07-f2f7-4755-9775-326f6b440bd1,network=Network(94472368-b72a-4e5d-ac59-40b24b7ba792),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd68a4c07-f2')#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.924 186962 DEBUG nova.virt.libvirt.vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:29:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-664402122',display_name='tempest-TestGettingAddress-server-664402122',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-664402122',id=138,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMDVssChktFh4KaYAW3qIeGMuNugTc7Dbw5csxo4G7cMEBd3kE//4edEXQEzew5nx3IZ0PY0s8qOeS5wzlVZFNNAqhiy1Tau85BZFwFUT3MGeCBu5wHnKv5gWDKRFcpa7w==',key_name='tempest-TestGettingAddress-312359009',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:29:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-4pvdqrie',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:29:30Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=e7c569c9-0fdc-4213-b9d1-779719d43c2f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.924 186962 DEBUG nova.network.os_vif_util [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.925 186962 DEBUG nova.network.os_vif_util [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.925 186962 DEBUG os_vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.926 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.926 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa96afdaa-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.927 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.930 186962 INFO os_vif [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:67:d6,bridge_name='br-int',has_traffic_filtering=True,id=a96afdaa-e76f-4d5d-a3e3-c94e58cce61b,network=Network(ff387e90-45c2-42d7-b536-fee4d2b6eb5e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa96afdaa-e7')#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.930 186962 INFO nova.virt.libvirt.driver [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Deleting instance files /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f_del#033[00m
Nov 29 02:29:54 np0005539505 nova_compute[186958]: 2025-11-29 07:29:54.931 186962 INFO nova.virt.libvirt.driver [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Deletion of /var/lib/nova/instances/e7c569c9-0fdc-4213-b9d1-779719d43c2f_del complete#033[00m
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [NOTICE]   (242298) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [NOTICE]   (242298) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [WARNING]  (242298) : Exiting Master process...
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [ALERT]    (242298) : Current worker (242300) exited with code 143 (Terminated)
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792[242292]: [WARNING]  (242298) : All workers exited. Exiting... (0)
Nov 29 02:29:55 np0005539505 systemd[1]: libpod-c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87.scope: Deactivated successfully.
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.011 186962 INFO nova.compute.manager [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.011 186962 DEBUG oslo.service.loopingcall [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.012 186962 DEBUG nova.compute.manager [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.012 186962 DEBUG nova.network.neutron [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:29:55 np0005539505 podman[242555]: 2025-11-29 07:29:55.017976852 +0000 UTC m=+0.251141729 container died c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.268 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.280 186962 DEBUG nova.compute.manager [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-unplugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.280 186962 DEBUG oslo_concurrency.lockutils [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.280 186962 DEBUG oslo_concurrency.lockutils [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.280 186962 DEBUG oslo_concurrency.lockutils [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.281 186962 DEBUG nova.compute.manager [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-unplugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.281 186962 DEBUG nova.compute.manager [req-2ceccd6d-2920-4b7d-8b46-d752a8f0faca req-81840157-543d-49ff-9c1d-e9f799f0df29 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-unplugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:29:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay-6b16bc3e482e4460309376a646882f55bf4a770c2d9569891a9a3b622202d24c-merged.mount: Deactivated successfully.
Nov 29 02:29:55 np0005539505 podman[242555]: 2025-11-29 07:29:55.375956713 +0000 UTC m=+0.609121600 container cleanup c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:29:55 np0005539505 podman[242613]: 2025-11-29 07:29:55.468360179 +0000 UTC m=+0.061976445 container remove c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.476 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[afa1bf36-8c03-421b-bafc-ec2014283705]: (4, ('Sat Nov 29 07:29:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 (c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87)\nc056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87\nSat Nov 29 07:29:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 (c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87)\nc056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.478 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e672d32f-f5b5-46ba-ac55-0ad872dea04a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.479 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94472368-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.519 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 kernel: tap94472368-b0: left promiscuous mode
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.525 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d016f699-10a9-400f-a432-3e61f79be1ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.535 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.542 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d2244082-b3d7-4120-ae4a-1e6bbc4178ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.544 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[edbbffe0-1987-4d4b-bc76-01626388154d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.566 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[81bda34a-5f72-442d-b358-446a97787af4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681737, 'reachable_time': 20774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242628, 'error': None, 'target': 'ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.568 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94472368-b72a-4e5d-ac59-40b24b7ba792 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.568 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[34431eb5-159c-45b5-aba4-e5e14d54191a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 systemd[1]: run-netns-ovnmeta\x2d94472368\x2db72a\x2d4e5d\x2dac59\x2d40b24b7ba792.mount: Deactivated successfully.
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.569 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a96afdaa-e76f-4d5d-a3e3-c94e58cce61b in datapath ff387e90-45c2-42d7-b536-fee4d2b6eb5e unbound from our chassis#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.570 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff387e90-45c2-42d7-b536-fee4d2b6eb5e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.571 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0107b4-2adb-48f0-8b0b-32eb9f447dad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.571 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e namespace which is not needed anymore#033[00m
Nov 29 02:29:55 np0005539505 systemd[1]: libpod-conmon-c056c8868845cb5d1e786a69549fd62e461ebb10cca6722ffa85c9bf7d92ce87.scope: Deactivated successfully.
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [NOTICE]   (242370) : haproxy version is 2.8.14-c23fe91
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [NOTICE]   (242370) : path to executable is /usr/sbin/haproxy
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [WARNING]  (242370) : Exiting Master process...
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [ALERT]    (242370) : Current worker (242372) exited with code 143 (Terminated)
Nov 29 02:29:55 np0005539505 neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e[242366]: [WARNING]  (242370) : All workers exited. Exiting... (0)
Nov 29 02:29:55 np0005539505 systemd[1]: libpod-bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5.scope: Deactivated successfully.
Nov 29 02:29:55 np0005539505 podman[242646]: 2025-11-29 07:29:55.697067122 +0000 UTC m=+0.045017076 container died bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:29:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5-userdata-shm.mount: Deactivated successfully.
Nov 29 02:29:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay-cafcb4864c9adbfc3baff8c08ff5d2a4fffd05524ac7819a46c9843e63efcb51-merged.mount: Deactivated successfully.
Nov 29 02:29:55 np0005539505 podman[242646]: 2025-11-29 07:29:55.730141558 +0000 UTC m=+0.078091522 container cleanup bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.730 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 systemd[1]: libpod-conmon-bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5.scope: Deactivated successfully.
Nov 29 02:29:55 np0005539505 podman[242677]: 2025-11-29 07:29:55.784713232 +0000 UTC m=+0.036276178 container remove bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.790 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f3af98f8-d57d-4bfb-ac79-f328e1a3fe5c]: (4, ('Sat Nov 29 07:29:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e (bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5)\nbfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5\nSat Nov 29 07:29:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e (bfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5)\nbfc4d753082f133d9109d2fed8fb4c275994b7ab2c410b3fa67cdcdf72c7ccf5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.791 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9cedf283-ceac-4a68-9a0c-a7d42298309a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.792 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff387e90-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.794 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 kernel: tapff387e90-40: left promiscuous mode
Nov 29 02:29:55 np0005539505 nova_compute[186958]: 2025-11-29 07:29:55.809 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.812 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[26a3fa6c-d21f-4bd7-b302-59bcb572fa18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.824 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[34b0e1f9-c042-4390-8260-9e1116d9e652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.825 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c83e9d6a-2600-473b-b75b-6d33272bb8f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.840 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c349c52c-5e53-44e4-b248-89d275100ab9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 681825, 'reachable_time': 21790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242692, 'error': None, 'target': 'ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.842 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff387e90-45c2-42d7-b536-fee4d2b6eb5e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:29:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:29:55.842 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[052d26ff-7b5c-42c0-87e7-f47246549dcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.193 186962 DEBUG nova.network.neutron [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updated VIF entry in instance network info cache for port d68a4c07-f2f7-4755-9775-326f6b440bd1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.194 186962 DEBUG nova.network.neutron [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [{"id": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "address": "fa:16:3e:01:57:5e", "network": {"id": "94472368-b72a-4e5d-ac59-40b24b7ba792", "bridge": "br-int", "label": "tempest-network-smoke--1761418689", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd68a4c07-f2", "ovs_interfaceid": "d68a4c07-f2f7-4755-9775-326f6b440bd1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "address": "fa:16:3e:9f:67:d6", "network": {"id": "ff387e90-45c2-42d7-b536-fee4d2b6eb5e", "bridge": "br-int", "label": "tempest-network-smoke--1344723239", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": 
[], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9f:67d6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa96afdaa-e7", "ovs_interfaceid": "a96afdaa-e76f-4d5d-a3e3-c94e58cce61b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.220 186962 DEBUG oslo_concurrency.lockutils [req-e22a3ef0-7ff0-46ac-994d-284ae48efb2d req-7051375d-fc09-4c1b-a1f1-48eeb7bd33b4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-e7c569c9-0fdc-4213-b9d1-779719d43c2f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:29:56 np0005539505 systemd[1]: run-netns-ovnmeta\x2dff387e90\x2d45c2\x2d42d7\x2db536\x2dfee4d2b6eb5e.mount: Deactivated successfully.
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.462 186962 DEBUG nova.network.neutron [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.505 186962 INFO nova.compute.manager [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Took 1.49 seconds to deallocate network for instance.#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.610 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.611 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.674 186962 DEBUG nova.compute.provider_tree [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.688 186962 DEBUG nova.scheduler.client.report [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.710 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.736 186962 DEBUG nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-unplugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.736 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.737 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.737 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.737 186962 DEBUG nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-unplugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.737 186962 WARNING nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received unexpected event network-vif-unplugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.738 186962 DEBUG nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.738 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.738 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.738 186962 DEBUG oslo_concurrency.lockutils [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.739 186962 DEBUG nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.739 186962 WARNING nova.compute.manager [req-cbd0aca6-fd1c-489f-a3e6-730d3cb5ebd0 req-ef1a63e6-4654-4ccb-ab81-505d58c754f7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received unexpected event network-vif-plugged-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.740 186962 INFO nova.scheduler.client.report [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance e7c569c9-0fdc-4213-b9d1-779719d43c2f#033[00m
Nov 29 02:29:56 np0005539505 nova_compute[186958]: 2025-11-29 07:29:56.816 186962 DEBUG oslo_concurrency.lockutils [None req-19ab0086-06b5-4aec-98ff-5575839996ec 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.390 186962 DEBUG nova.compute.manager [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.390 186962 DEBUG oslo_concurrency.lockutils [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.391 186962 DEBUG oslo_concurrency.lockutils [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.391 186962 DEBUG oslo_concurrency.lockutils [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "e7c569c9-0fdc-4213-b9d1-779719d43c2f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.392 186962 DEBUG nova.compute.manager [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] No waiting events found dispatching network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.392 186962 WARNING nova.compute.manager [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received unexpected event network-vif-plugged-d68a4c07-f2f7-4755-9775-326f6b440bd1 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.393 186962 DEBUG nova.compute.manager [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-deleted-a96afdaa-e76f-4d5d-a3e3-c94e58cce61b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:57 np0005539505 nova_compute[186958]: 2025-11-29 07:29:57.393 186962 DEBUG nova.compute.manager [req-7010061d-162c-43e5-9565-97da982eca52 req-8cbe3d6d-78d8-4ad7-a6d0-c510199005f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Received event network-vif-deleted-d68a4c07-f2f7-4755-9775-326f6b440bd1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:29:57 np0005539505 podman[242693]: 2025-11-29 07:29:57.724466141 +0000 UTC m=+0.052592400 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:29:58 np0005539505 nova_compute[186958]: 2025-11-29 07:29:58.306 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:29:59 np0005539505 nova_compute[186958]: 2025-11-29 07:29:59.930 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:00 np0005539505 nova_compute[186958]: 2025-11-29 07:30:00.732 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.233 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.233 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.256 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.375 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.375 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.381 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.382 186962 INFO nova.compute.claims [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.548 186962 DEBUG nova.compute.provider_tree [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.561 186962 DEBUG nova.scheduler.client.report [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.587 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.587 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.647 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.648 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.696 186962 INFO nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.720 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.857 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.858 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.859 186962 INFO nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Creating image(s)#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.859 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.859 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.860 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.872 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.897 186962 DEBUG nova.policy [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a22b815e615747308906551e90e82f75', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '65e2103ae49c4ff3a639d9ef42c848bc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.928 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.929 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.929 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.941 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.996 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:02 np0005539505 nova_compute[186958]: 2025-11-29 07:30:02.997 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.037 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.038 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.038 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.092 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.093 186962 DEBUG nova.virt.disk.api [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Checking if we can resize image /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.094 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.148 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.149 186962 DEBUG nova.virt.disk.api [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Cannot resize image /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.150 186962 DEBUG nova.objects.instance [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lazy-loading 'migration_context' on Instance uuid 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.177 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.177 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Ensure instance console log exists: /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.177 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.178 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:03 np0005539505 nova_compute[186958]: 2025-11-29 07:30:03.178 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:04 np0005539505 nova_compute[186958]: 2025-11-29 07:30:04.102 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Successfully created port: 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:30:04 np0005539505 nova_compute[186958]: 2025-11-29 07:30:04.933 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:05 np0005539505 nova_compute[186958]: 2025-11-29 07:30:05.784 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:05 np0005539505 nova_compute[186958]: 2025-11-29 07:30:05.984 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Successfully updated port: 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.594 186962 DEBUG nova.compute.manager [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-changed-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.595 186962 DEBUG nova.compute.manager [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Refreshing instance network info cache due to event network-changed-3f16ba4d-248c-49cb-870a-e6bb1bcbe766. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.595 186962 DEBUG oslo_concurrency.lockutils [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.595 186962 DEBUG oslo_concurrency.lockutils [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.596 186962 DEBUG nova.network.neutron [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Refreshing network info cache for port 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.610 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.660 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:06 np0005539505 nova_compute[186958]: 2025-11-29 07:30:06.795 186962 DEBUG nova.network.neutron [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:30:07 np0005539505 podman[242730]: 2025-11-29 07:30:07.762697663 +0000 UTC m=+0.089814393 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:30:07 np0005539505 podman[242731]: 2025-11-29 07:30:07.765125231 +0000 UTC m=+0.089243256 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:30:07 np0005539505 nova_compute[186958]: 2025-11-29 07:30:07.956 186962 DEBUG nova.network.neutron [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:07 np0005539505 nova_compute[186958]: 2025-11-29 07:30:07.991 186962 DEBUG oslo_concurrency.lockutils [req-100e82da-d87d-4f91-96c0-efd8297b5fc4 req-bde5852b-6158-4013-8c56-a47715478845 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:30:07 np0005539505 nova_compute[186958]: 2025-11-29 07:30:07.992 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquired lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:30:07 np0005539505 nova_compute[186958]: 2025-11-29 07:30:07.992 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:30:09 np0005539505 nova_compute[186958]: 2025-11-29 07:30:09.833 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:30:09 np0005539505 nova_compute[186958]: 2025-11-29 07:30:09.888 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401394.8870814, e7c569c9-0fdc-4213-b9d1-779719d43c2f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:09 np0005539505 nova_compute[186958]: 2025-11-29 07:30:09.889 186962 INFO nova.compute.manager [-] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:30:09 np0005539505 nova_compute[186958]: 2025-11-29 07:30:09.907 186962 DEBUG nova.compute.manager [None req-4c603cfd-52b9-43e8-96ca-0c2389d92ac4 - - - - - -] [instance: e7c569c9-0fdc-4213-b9d1-779719d43c2f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:09 np0005539505 nova_compute[186958]: 2025-11-29 07:30:09.935 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:10 np0005539505 nova_compute[186958]: 2025-11-29 07:30:10.786 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:11 np0005539505 podman[242780]: 2025-11-29 07:30:11.718172161 +0000 UTC m=+0.055065360 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:30:11 np0005539505 podman[242781]: 2025-11-29 07:30:11.718194912 +0000 UTC m=+0.051672654 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.829 186962 DEBUG nova.network.neutron [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Updating instance_info_cache with network_info: [{"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.861 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Releasing lock "refresh_cache-96d9c767-4ce6-4a5f-93bc-236f9592b9d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.861 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance network_info: |[{"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.868 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Start _get_guest_xml network_info=[{"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.874 186962 WARNING nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.887 186962 DEBUG nova.virt.libvirt.host [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.888 186962 DEBUG nova.virt.libvirt.host [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.892 186962 DEBUG nova.virt.libvirt.host [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.893 186962 DEBUG nova.virt.libvirt.host [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.895 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.896 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.896 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.897 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.898 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.898 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.898 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.899 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.899 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.900 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.900 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.901 186962 DEBUG nova.virt.hardware [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.908 186962 DEBUG nova.virt.libvirt.vif [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1444902494',display_name='tempest-ServerPasswordTestJSON-server-1444902494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1444902494',id=142,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e2103ae49c4ff3a639d9ef42c848bc',ramdisk_id='',reservation_id='r-u4mv7oc1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1334690423',owner_user_name='tempest-ServerPasswordTestJSON-1334690423-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:02Z,user_data=None,user_id='a22b815e615747308906551e90e82f75',uuid=96d9c767-4ce6-4a5f-93bc-236f9592b9d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.909 186962 DEBUG nova.network.os_vif_util [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converting VIF {"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.910 186962 DEBUG nova.network.os_vif_util [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.911 186962 DEBUG nova.objects.instance [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lazy-loading 'pci_devices' on Instance uuid 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.926 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <uuid>96d9c767-4ce6-4a5f-93bc-236f9592b9d4</uuid>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <name>instance-0000008e</name>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:name>tempest-ServerPasswordTestJSON-server-1444902494</nova:name>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:30:12</nova:creationTime>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:user uuid="a22b815e615747308906551e90e82f75">tempest-ServerPasswordTestJSON-1334690423-project-member</nova:user>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:project uuid="65e2103ae49c4ff3a639d9ef42c848bc">tempest-ServerPasswordTestJSON-1334690423</nova:project>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        <nova:port uuid="3f16ba4d-248c-49cb-870a-e6bb1bcbe766">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="serial">96d9c767-4ce6-4a5f-93bc-236f9592b9d4</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="uuid">96d9c767-4ce6-4a5f-93bc-236f9592b9d4</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.config"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:99:22:d6"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <target dev="tap3f16ba4d-24"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/console.log" append="off"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:30:12 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:30:12 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:30:12 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:30:12 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.927 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Preparing to wait for external event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.928 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.928 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.929 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.930 186962 DEBUG nova.virt.libvirt.vif [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1444902494',display_name='tempest-ServerPasswordTestJSON-server-1444902494',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1444902494',id=142,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='65e2103ae49c4ff3a639d9ef42c848bc',ramdisk_id='',reservation_id='r-u4mv7oc1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerPasswordTestJSON-1334690423',owner_user_name='tempest-ServerPasswordTestJSON-1334690423-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:02Z,user_data=None,user_id='a22b815e615747308906551e90e82f75',uuid=96d9c767-4ce6-4a5f-93bc-236f9592b9d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.931 186962 DEBUG nova.network.os_vif_util [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converting VIF {"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.932 186962 DEBUG nova.network.os_vif_util [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.933 186962 DEBUG os_vif [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.935 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.935 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.938 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.939 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f16ba4d-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.940 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3f16ba4d-24, col_values=(('external_ids', {'iface-id': '3f16ba4d-248c-49cb-870a-e6bb1bcbe766', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:22:d6', 'vm-uuid': '96d9c767-4ce6-4a5f-93bc-236f9592b9d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.942 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:12 np0005539505 NetworkManager[55134]: <info>  [1764401412.9435] manager: (tap3f16ba4d-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/330)
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.946 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:12 np0005539505 nova_compute[186958]: 2025-11-29 07:30:12.949 186962 INFO os_vif [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24')#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.008 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.009 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.009 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] No VIF found with MAC fa:16:3e:99:22:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.009 186962 INFO nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Using config drive#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.395 186962 INFO nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Creating config drive at /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.config#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.400 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_llougm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.523 186962 DEBUG oslo_concurrency.processutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj_llougm" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:13 np0005539505 kernel: tap3f16ba4d-24: entered promiscuous mode
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.5844] manager: (tap3f16ba4d-24): new Tun device (/org/freedesktop/NetworkManager/Devices/331)
Nov 29 02:30:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:13Z|00676|binding|INFO|Claiming lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 for this chassis.
Nov 29 02:30:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:13Z|00677|binding|INFO|3f16ba4d-248c-49cb-870a-e6bb1bcbe766: Claiming fa:16:3e:99:22:d6 10.100.0.9
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.641 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:22:d6 10.100.0.9'], port_security=['fa:16:3e:99:22:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96d9c767-4ce6-4a5f-93bc-236f9592b9d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a51081b-97b8-4e31-8888-46d487c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '65e2103ae49c4ff3a639d9ef42c848bc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15b3f0a4-0666-49e7-a7a0-15391258b81e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4bd2635-b2d6-4c46-9491-38a374fb7555, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3f16ba4d-248c-49cb-870a-e6bb1bcbe766) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:13Z|00678|binding|INFO|Setting lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 ovn-installed in OVS
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.643 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 in datapath 3a51081b-97b8-4e31-8888-46d487c650ae bound to our chassis#033[00m
Nov 29 02:30:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:13Z|00679|binding|INFO|Setting lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 up in Southbound
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.645 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a51081b-97b8-4e31-8888-46d487c650ae#033[00m
Nov 29 02:30:13 np0005539505 systemd-udevd[242839]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.649 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.655 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ff356d84-739e-4f75-98d1-ebc72b6cc174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.655 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a51081b-91 in ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.658 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a51081b-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.658 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b04286-7e2e-4fe6-a5b3-ef9375c509d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 systemd-machined[153285]: New machine qemu-75-instance-0000008e.
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.659 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cc8a666e-2aa0-47f7-ab0b-3196b0b3972b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.6630] device (tap3f16ba4d-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.6638] device (tap3f16ba4d-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:30:13 np0005539505 systemd[1]: Started Virtual Machine qemu-75-instance-0000008e.
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.671 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[28e38b59-d5da-41b6-91f9-9d9826586251]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.684 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6bbef9dc-9475-43a2-871d-294e1a0d2f5a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.687 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.712 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dd43d6d6-1bf9-4335-8d9a-20cefc2ab745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 systemd-udevd[242843]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.7374] manager: (tap3a51081b-90): new Veth device (/org/freedesktop/NetworkManager/Devices/332)
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.737 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4702f96-0e30-480f-88b0-74d5fc8f0641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.765 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[43ca168a-1127-4dbb-8b78-ae944997adc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.769 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0f790592-2214-4761-a555-f373e5d65415]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.7889] device (tap3a51081b-90): carrier: link connected
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.794 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d9075fbe-6f1a-4abc-a7fb-135f43ef511e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.811 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0783a78a-94c4-4cc8-b798-49fcb0a0a1ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a51081b-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:03:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686137, 'reachable_time': 30979, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242873, 'error': None, 'target': 'ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.826 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[80c7c1c4-df14-4c3a-9e20-3d974a4854c6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:372'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 686137, 'tstamp': 686137}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242874, 'error': None, 'target': 'ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.843 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c34dbf46-401d-450e-af59-159fff5c16dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a51081b-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:03:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 215], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686137, 'reachable_time': 30979, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242875, 'error': None, 'target': 'ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.874 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[174188f0-8c33-49c7-9410-1fd04e42f60a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.876 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.928 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b59f5bdf-3192-4937-87ee-7399ca3b9879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.929 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a51081b-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.930 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.930 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a51081b-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:13 np0005539505 NetworkManager[55134]: <info>  [1764401413.9325] manager: (tap3a51081b-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Nov 29 02:30:13 np0005539505 kernel: tap3a51081b-90: entered promiscuous mode
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:13Z|00680|binding|INFO|Releasing lport 7591750a-ef4f-4120-8d70-396cb442c56e from this chassis (sb_readonly=0)
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.934 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a51081b-90, col_values=(('external_ids', {'iface-id': '7591750a-ef4f-4120-8d70-396cb442c56e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:13 np0005539505 nova_compute[186958]: 2025-11-29 07:30:13.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.948 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a51081b-97b8-4e31-8888-46d487c650ae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a51081b-97b8-4e31-8888-46d487c650ae.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.949 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f4355966-d482-495d-b435-f81435227527]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.950 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-3a51081b-97b8-4e31-8888-46d487c650ae
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/3a51081b-97b8-4e31-8888-46d487c650ae.pid.haproxy
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 3a51081b-97b8-4e31-8888-46d487c650ae
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:30:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:13.950 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae', 'env', 'PROCESS_TAG=haproxy-3a51081b-97b8-4e31-8888-46d487c650ae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a51081b-97b8-4e31-8888-46d487c650ae.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.086 186962 DEBUG nova.compute.manager [req-495532b6-1d5d-4722-946b-d8b370d6f271 req-4f1faa82-9b4e-411d-bfb5-2134800056e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.087 186962 DEBUG oslo_concurrency.lockutils [req-495532b6-1d5d-4722-946b-d8b370d6f271 req-4f1faa82-9b4e-411d-bfb5-2134800056e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.087 186962 DEBUG oslo_concurrency.lockutils [req-495532b6-1d5d-4722-946b-d8b370d6f271 req-4f1faa82-9b4e-411d-bfb5-2134800056e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.087 186962 DEBUG oslo_concurrency.lockutils [req-495532b6-1d5d-4722-946b-d8b370d6f271 req-4f1faa82-9b4e-411d-bfb5-2134800056e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.087 186962 DEBUG nova.compute.manager [req-495532b6-1d5d-4722-946b-d8b370d6f271 req-4f1faa82-9b4e-411d-bfb5-2134800056e9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Processing event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.285 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401414.2846584, 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.285 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] VM Started (Lifecycle Event)#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.288 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.292 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.296 186962 INFO nova.virt.libvirt.driver [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance spawned successfully.#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.296 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.309 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.311 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:30:14 np0005539505 podman[242914]: 2025-11-29 07:30:14.320161792 +0000 UTC m=+0.055449250 container create d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.324 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.324 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.325 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.325 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.325 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.326 186962 DEBUG nova.virt.libvirt.driver [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:30:14 np0005539505 systemd[1]: Started libpod-conmon-d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3.scope.
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.370 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.371 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401414.285523, 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.371 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:14 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:30:14 np0005539505 podman[242914]: 2025-11-29 07:30:14.287173629 +0000 UTC m=+0.022461117 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:30:14 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/934deed2d60304ad8e120984f556a51339f5d04442000b7c17e38f942c5f8717/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:30:14 np0005539505 podman[242914]: 2025-11-29 07:30:14.395658439 +0000 UTC m=+0.130945927 container init d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:30:14 np0005539505 podman[242914]: 2025-11-29 07:30:14.401257728 +0000 UTC m=+0.136545186 container start d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.417 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.419 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:14 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [NOTICE]   (242933) : New worker (242935) forked
Nov 29 02:30:14 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [NOTICE]   (242933) : Loading success.
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.432 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401414.2914314, 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.432 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.449 186962 INFO nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Took 11.59 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.450 186962 DEBUG nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.674 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.677 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.699 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.748 186962 INFO nova.compute.manager [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Took 12.41 seconds to build instance.#033[00m
Nov 29 02:30:14 np0005539505 nova_compute[186958]: 2025-11-29 07:30:14.767 186962 DEBUG oslo_concurrency.lockutils [None req-96c2296d-68c7-4dbf-ae9c-60fb5b3b009f a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:15 np0005539505 nova_compute[186958]: 2025-11-29 07:30:15.789 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.175 186962 DEBUG nova.compute.manager [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.175 186962 DEBUG oslo_concurrency.lockutils [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.176 186962 DEBUG oslo_concurrency.lockutils [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.176 186962 DEBUG oslo_concurrency.lockutils [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.177 186962 DEBUG nova.compute.manager [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] No waiting events found dispatching network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.177 186962 WARNING nova.compute.manager [req-ece7fc92-68ee-405c-89a9-340acb7d5fe8 req-86477772-0fb5-496e-9630-da739a4944b3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received unexpected event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.838 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.839 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.840 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.840 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.840 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.867 186962 INFO nova.compute.manager [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Terminating instance#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.883 186962 DEBUG nova.compute.manager [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:30:16 np0005539505 kernel: tap3f16ba4d-24 (unregistering): left promiscuous mode
Nov 29 02:30:16 np0005539505 NetworkManager[55134]: <info>  [1764401416.9070] device (tap3f16ba4d-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:30:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:16Z|00681|binding|INFO|Releasing lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 from this chassis (sb_readonly=0)
Nov 29 02:30:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:16Z|00682|binding|INFO|Setting lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 down in Southbound
Nov 29 02:30:16 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:16Z|00683|binding|INFO|Removing iface tap3f16ba4d-24 ovn-installed in OVS
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:16.966 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:22:d6 10.100.0.9'], port_security=['fa:16:3e:99:22:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96d9c767-4ce6-4a5f-93bc-236f9592b9d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a51081b-97b8-4e31-8888-46d487c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '65e2103ae49c4ff3a639d9ef42c848bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15b3f0a4-0666-49e7-a7a0-15391258b81e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4bd2635-b2d6-4c46-9491-38a374fb7555, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3f16ba4d-248c-49cb-870a-e6bb1bcbe766) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:16.968 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 in datapath 3a51081b-97b8-4e31-8888-46d487c650ae unbound from our chassis#033[00m
Nov 29 02:30:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:16.970 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a51081b-97b8-4e31-8888-46d487c650ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:30:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:16.971 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[35d3e0cd-c4a5-48b5-a96b-923e9d028769]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:16.971 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae namespace which is not needed anymore#033[00m
Nov 29 02:30:16 np0005539505 nova_compute[186958]: 2025-11-29 07:30:16.974 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Nov 29 02:30:17 np0005539505 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000008e.scope: Consumed 3.188s CPU time.
Nov 29 02:30:17 np0005539505 systemd-machined[153285]: Machine qemu-75-instance-0000008e terminated.
Nov 29 02:30:17 np0005539505 kernel: tap3f16ba4d-24: entered promiscuous mode
Nov 29 02:30:17 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [NOTICE]   (242933) : haproxy version is 2.8.14-c23fe91
Nov 29 02:30:17 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [NOTICE]   (242933) : path to executable is /usr/sbin/haproxy
Nov 29 02:30:17 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [WARNING]  (242933) : Exiting Master process...
Nov 29 02:30:17 np0005539505 kernel: tap3f16ba4d-24 (unregistering): left promiscuous mode
Nov 29 02:30:17 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [ALERT]    (242933) : Current worker (242935) exited with code 143 (Terminated)
Nov 29 02:30:17 np0005539505 neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae[242929]: [WARNING]  (242933) : All workers exited. Exiting... (0)
Nov 29 02:30:17 np0005539505 NetworkManager[55134]: <info>  [1764401417.1069] manager: (tap3f16ba4d-24): new Tun device (/org/freedesktop/NetworkManager/Devices/334)
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.108 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:17Z|00684|binding|INFO|Claiming lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 for this chassis.
Nov 29 02:30:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:17Z|00685|binding|INFO|3f16ba4d-248c-49cb-870a-e6bb1bcbe766: Claiming fa:16:3e:99:22:d6 10.100.0.9
Nov 29 02:30:17 np0005539505 systemd[1]: libpod-d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3.scope: Deactivated successfully.
Nov 29 02:30:17 np0005539505 podman[242968]: 2025-11-29 07:30:17.113452429 +0000 UTC m=+0.050808009 container died d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.118 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:22:d6 10.100.0.9'], port_security=['fa:16:3e:99:22:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96d9c767-4ce6-4a5f-93bc-236f9592b9d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a51081b-97b8-4e31-8888-46d487c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '65e2103ae49c4ff3a639d9ef42c848bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15b3f0a4-0666-49e7-a7a0-15391258b81e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4bd2635-b2d6-4c46-9491-38a374fb7555, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3f16ba4d-248c-49cb-870a-e6bb1bcbe766) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:17 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:17Z|00686|binding|INFO|Releasing lport 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 from this chassis (sb_readonly=0)
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.126 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.134 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:22:d6 10.100.0.9'], port_security=['fa:16:3e:99:22:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96d9c767-4ce6-4a5f-93bc-236f9592b9d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a51081b-97b8-4e31-8888-46d487c650ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '65e2103ae49c4ff3a639d9ef42c848bc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15b3f0a4-0666-49e7-a7a0-15391258b81e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4bd2635-b2d6-4c46-9491-38a374fb7555, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=3f16ba4d-248c-49cb-870a-e6bb1bcbe766) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:17 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3-userdata-shm.mount: Deactivated successfully.
Nov 29 02:30:17 np0005539505 systemd[1]: var-lib-containers-storage-overlay-934deed2d60304ad8e120984f556a51339f5d04442000b7c17e38f942c5f8717-merged.mount: Deactivated successfully.
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.155 186962 INFO nova.virt.libvirt.driver [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Instance destroyed successfully.#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.156 186962 DEBUG nova.objects.instance [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lazy-loading 'resources' on Instance uuid 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:17 np0005539505 podman[242968]: 2025-11-29 07:30:17.165387609 +0000 UTC m=+0.102743209 container cleanup d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.167 186962 DEBUG nova.virt.libvirt.vif [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:30:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerPasswordTestJSON-server-1444902494',display_name='tempest-ServerPasswordTestJSON-server-1444902494',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverpasswordtestjson-server-1444902494',id=142,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:30:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='65e2103ae49c4ff3a639d9ef42c848bc',ramdisk_id='',reservation_id='r-u4mv7oc1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerPasswordTestJSON-1334690423',owner_user_name='tempest-ServerPasswordTestJSON-1334690423-project-member',password_0='',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:30:16Z,user_data=None,user_id='a22b815e615747308906551e90e82f75',uuid=96d9c767-4ce6-4a5f-93bc-236f9592b9d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.168 186962 DEBUG nova.network.os_vif_util [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converting VIF {"id": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "address": "fa:16:3e:99:22:d6", "network": {"id": "3a51081b-97b8-4e31-8888-46d487c650ae", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-1363888145-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "65e2103ae49c4ff3a639d9ef42c848bc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f16ba4d-24", "ovs_interfaceid": "3f16ba4d-248c-49cb-870a-e6bb1bcbe766", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.169 186962 DEBUG nova.network.os_vif_util [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.169 186962 DEBUG os_vif [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.170 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.171 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f16ba4d-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.174 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 systemd[1]: libpod-conmon-d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3.scope: Deactivated successfully.
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.179 186962 INFO os_vif [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:22:d6,bridge_name='br-int',has_traffic_filtering=True,id=3f16ba4d-248c-49cb-870a-e6bb1bcbe766,network=Network(3a51081b-97b8-4e31-8888-46d487c650ae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f16ba4d-24')#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.180 186962 INFO nova.virt.libvirt.driver [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Deleting instance files /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4_del#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.181 186962 INFO nova.virt.libvirt.driver [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Deletion of /var/lib/nova/instances/96d9c767-4ce6-4a5f-93bc-236f9592b9d4_del complete#033[00m
Nov 29 02:30:17 np0005539505 podman[243013]: 2025-11-29 07:30:17.226863869 +0000 UTC m=+0.040127497 container remove d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.234 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c33419a0-2516-4b8a-98dc-1209083f8624]: (4, ('Sat Nov 29 07:30:17 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae (d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3)\nd8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3\nSat Nov 29 07:30:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae (d8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3)\nd8780b38bca46cd7a9fcd82844ac0551f0881afdbe72669578f94e0f172084e3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.235 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21359197-709b-4a77-8b21-6aeb254e40e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.237 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a51081b-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 kernel: tap3a51081b-90: left promiscuous mode
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.242 186962 INFO nova.compute.manager [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.244 186962 DEBUG oslo.service.loopingcall [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.244 186962 DEBUG nova.compute.manager [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.244 186962 DEBUG nova.network.neutron [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:30:17 np0005539505 nova_compute[186958]: 2025-11-29 07:30:17.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.255 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82b51dc2-1b28-40a6-b147-20e76ee1f8d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.270 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a48b68-0027-45be-abe0-843085cf023c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.273 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[05805df4-9697-4c6d-9c63-f5317b3a4c95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.289 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66c0d64a-f07d-49e6-9d58-649f4e4b95b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686129, 'reachable_time': 41299, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243028, 'error': None, 'target': 'ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.291 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a51081b-97b8-4e31-8888-46d487c650ae deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.291 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[1d43f32e-4aaf-4930-b4b6-a4c994107e6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.292 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 in datapath 3a51081b-97b8-4e31-8888-46d487c650ae unbound from our chassis#033[00m
Nov 29 02:30:17 np0005539505 systemd[1]: run-netns-ovnmeta\x2d3a51081b\x2d97b8\x2d4e31\x2d8888\x2d46d487c650ae.mount: Deactivated successfully.
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.293 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a51081b-97b8-4e31-8888-46d487c650ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.294 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7623cb-0a61-4bd0-897d-abec33b0cbd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.294 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 3f16ba4d-248c-49cb-870a-e6bb1bcbe766 in datapath 3a51081b-97b8-4e31-8888-46d487c650ae unbound from our chassis#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.295 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a51081b-97b8-4e31-8888-46d487c650ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:30:17 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:17.295 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a93b97a-f058-45e7-9f80-4f57259d1651]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.124 186962 DEBUG nova.network.neutron [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.138 186962 INFO nova.compute.manager [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Took 0.89 seconds to deallocate network for instance.#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.216 186962 DEBUG nova.compute.manager [req-3c5457fe-5fb2-411e-883a-ebae84c0d90c req-db0d59cb-a880-4a65-af91-b74a2c93f90b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-vif-deleted-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.476 186962 DEBUG nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-vif-unplugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.476 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.477 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.477 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.477 186962 DEBUG nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] No waiting events found dispatching network-vif-unplugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.478 186962 WARNING nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received unexpected event network-vif-unplugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.478 186962 DEBUG nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.478 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.478 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.479 186962 DEBUG oslo_concurrency.lockutils [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.479 186962 DEBUG nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] No waiting events found dispatching network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.479 186962 WARNING nova.compute.manager [req-8cec1b4d-b53d-42a9-bacc-0ffd3d090cf2 req-99b083ab-b3e2-4536-991d-baafff8956d4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Received unexpected event network-vif-plugged-3f16ba4d-248c-49cb-870a-e6bb1bcbe766 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.495 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.495 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.553 186962 DEBUG nova.compute.provider_tree [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.579 186962 DEBUG nova.scheduler.client.report [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.809 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:18 np0005539505 nova_compute[186958]: 2025-11-29 07:30:18.849 186962 INFO nova.scheduler.client.report [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Deleted allocations for instance 96d9c767-4ce6-4a5f-93bc-236f9592b9d4#033[00m
Nov 29 02:30:19 np0005539505 nova_compute[186958]: 2025-11-29 07:30:19.235 186962 DEBUG oslo_concurrency.lockutils [None req-87cf0a1b-ba7e-420c-a532-f65e469a1ef6 a22b815e615747308906551e90e82f75 65e2103ae49c4ff3a639d9ef42c848bc - - default default] Lock "96d9c767-4ce6-4a5f-93bc-236f9592b9d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:20 np0005539505 nova_compute[186958]: 2025-11-29 07:30:20.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:21 np0005539505 nova_compute[186958]: 2025-11-29 07:30:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:21 np0005539505 nova_compute[186958]: 2025-11-29 07:30:21.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.200 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.386 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.387 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.406 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.532 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.533 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.538 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.539 186962 INFO nova.compute.claims [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.881 186962 DEBUG nova.compute.provider_tree [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:22 np0005539505 nova_compute[186958]: 2025-11-29 07:30:22.936 186962 DEBUG nova.scheduler.client.report [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.074 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.075 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.361 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.362 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.544 186962 INFO nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.692 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:30:23 np0005539505 nova_compute[186958]: 2025-11-29 07:30:23.846 186962 DEBUG nova.policy [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.249 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.250 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.250 186962 INFO nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Creating image(s)#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.251 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.251 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.252 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.264 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.324 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.325 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.325 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.337 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.393 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.394 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.434 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.435 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.436 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.500 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.502 186962 DEBUG nova.virt.disk.api [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.502 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.556 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.557 186962 DEBUG nova.virt.disk.api [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.557 186962 DEBUG nova.objects.instance [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.608 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.609 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Ensure instance console log exists: /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.609 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.610 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:24 np0005539505 nova_compute[186958]: 2025-11-29 07:30:24.610 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:25 np0005539505 podman[243046]: 2025-11-29 07:30:25.719022294 +0000 UTC m=+0.047024982 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:30:25 np0005539505 podman[243045]: 2025-11-29 07:30:25.722289677 +0000 UTC m=+0.052354413 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:30:25 np0005539505 nova_compute[186958]: 2025-11-29 07:30:25.837 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.466 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.466 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.466 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.467 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.617 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.619 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5694MB free_disk=73.07343292236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.619 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.620 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.852 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.853 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.853 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.908 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.949 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Successfully created port: a8aba1b5-0a8f-4d8c-9284-d89c9b87efed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:30:26 np0005539505 nova_compute[186958]: 2025-11-29 07:30:26.954 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:30:27 np0005539505 nova_compute[186958]: 2025-11-29 07:30:27.077 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:30:27 np0005539505 nova_compute[186958]: 2025-11-29 07:30:27.078 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.458s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:27 np0005539505 nova_compute[186958]: 2025-11-29 07:30:27.203 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:27 np0005539505 nova_compute[186958]: 2025-11-29 07:30:27.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:27.509 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:27.510 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:27.510 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:28 np0005539505 nova_compute[186958]: 2025-11-29 07:30:28.079 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:28 np0005539505 nova_compute[186958]: 2025-11-29 07:30:28.079 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:30:28 np0005539505 nova_compute[186958]: 2025-11-29 07:30:28.079 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:30:28 np0005539505 nova_compute[186958]: 2025-11-29 07:30:28.101 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:30:28 np0005539505 nova_compute[186958]: 2025-11-29 07:30:28.101 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:30:28 np0005539505 podman[243090]: 2025-11-29 07:30:28.756622824 +0000 UTC m=+0.066730499 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:30:29 np0005539505 nova_compute[186958]: 2025-11-29 07:30:29.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:29 np0005539505 nova_compute[186958]: 2025-11-29 07:30:29.850 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Successfully updated port: a8aba1b5-0a8f-4d8c-9284-d89c9b87efed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.054 186962 DEBUG nova.compute.manager [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-changed-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.054 186962 DEBUG nova.compute.manager [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Refreshing instance network info cache due to event network-changed-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.054 186962 DEBUG oslo_concurrency.lockutils [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.054 186962 DEBUG oslo_concurrency.lockutils [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.055 186962 DEBUG nova.network.neutron [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Refreshing network info cache for port a8aba1b5-0a8f-4d8c-9284-d89c9b87efed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.063 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.325 186962 DEBUG nova.network.neutron [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:30:30 np0005539505 nova_compute[186958]: 2025-11-29 07:30:30.840 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:31 np0005539505 nova_compute[186958]: 2025-11-29 07:30:31.735 186962 DEBUG nova.network.neutron [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:31 np0005539505 nova_compute[186958]: 2025-11-29 07:30:31.763 186962 DEBUG oslo_concurrency.lockutils [req-43faa860-5e7a-4d94-8aea-2bd29eddbd5f req-6aaa3072-6ab5-45e5-84c5-f18d65eac6b1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:30:31 np0005539505 nova_compute[186958]: 2025-11-29 07:30:31.763 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:30:31 np0005539505 nova_compute[186958]: 2025-11-29 07:30:31.764 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:30:32 np0005539505 nova_compute[186958]: 2025-11-29 07:30:32.152 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401417.1519296, 96d9c767-4ce6-4a5f-93bc-236f9592b9d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:32 np0005539505 nova_compute[186958]: 2025-11-29 07:30:32.153 186962 INFO nova.compute.manager [-] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:30:32 np0005539505 nova_compute[186958]: 2025-11-29 07:30:32.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:32 np0005539505 nova_compute[186958]: 2025-11-29 07:30:32.697 186962 DEBUG nova.compute.manager [None req-b8660bec-0385-4541-9d6b-f14405178d53 - - - - - -] [instance: 96d9c767-4ce6-4a5f-93bc-236f9592b9d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:33 np0005539505 nova_compute[186958]: 2025-11-29 07:30:33.740 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:30:35 np0005539505 nova_compute[186958]: 2025-11-29 07:30:35.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:30:35 np0005539505 nova_compute[186958]: 2025-11-29 07:30:35.843 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.173 186962 DEBUG nova.network.neutron [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updating instance_info_cache with network_info: [{"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.248 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.249 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance network_info: |[{"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.251 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Start _get_guest_xml network_info=[{"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.255 186962 WARNING nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.261 186962 DEBUG nova.virt.libvirt.host [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.262 186962 DEBUG nova.virt.libvirt.host [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.266 186962 DEBUG nova.virt.libvirt.host [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.267 186962 DEBUG nova.virt.libvirt.host [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.268 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.268 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.269 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.269 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.270 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.270 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.270 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.270 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.271 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.271 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.271 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.272 186962 DEBUG nova.virt.hardware [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.275 186962 DEBUG nova.virt.libvirt.vif [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1778982610',display_name='tempest-TestNetworkBasicOps-server-1778982610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1778982610',id=143,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7vmcEuMTXEy7Figk9HNQZAVTFRHfbot7/wlxWHZi4/TPIsOsfbK8Oqf7oXJmr2L7vCQh9F6AJKHLqK3jMPOWatEqd+CE0cXMu9iIXc1Crg6VnhLG3Foof5KhaIpHTbeg==',key_name='tempest-TestNetworkBasicOps-1699100556',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0wojgbjj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:23Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f1e06afd-5b90-4cc3-87a6-54bcfcb07b91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.275 186962 DEBUG nova.network.os_vif_util [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.276 186962 DEBUG nova.network.os_vif_util [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:30:36 np0005539505 nova_compute[186958]: 2025-11-29 07:30:36.277 186962 DEBUG nova.objects.instance [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.146 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <uuid>f1e06afd-5b90-4cc3-87a6-54bcfcb07b91</uuid>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <name>instance-0000008f</name>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkBasicOps-server-1778982610</nova:name>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:30:36</nova:creationTime>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        <nova:port uuid="a8aba1b5-0a8f-4d8c-9284-d89c9b87efed">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="serial">f1e06afd-5b90-4cc3-87a6-54bcfcb07b91</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="uuid">f1e06afd-5b90-4cc3-87a6-54bcfcb07b91</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.config"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:b2:c4:17"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <target dev="tapa8aba1b5-0a"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/console.log" append="off"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:30:37 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:30:37 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:30:37 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:30:37 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.148 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Preparing to wait for external event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.148 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.148 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.148 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.149 186962 DEBUG nova.virt.libvirt.vif [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1778982610',display_name='tempest-TestNetworkBasicOps-server-1778982610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1778982610',id=143,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7vmcEuMTXEy7Figk9HNQZAVTFRHfbot7/wlxWHZi4/TPIsOsfbK8Oqf7oXJmr2L7vCQh9F6AJKHLqK3jMPOWatEqd+CE0cXMu9iIXc1Crg6VnhLG3Foof5KhaIpHTbeg==',key_name='tempest-TestNetworkBasicOps-1699100556',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0wojgbjj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:30:23Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f1e06afd-5b90-4cc3-87a6-54bcfcb07b91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.149 186962 DEBUG nova.network.os_vif_util [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.150 186962 DEBUG nova.network.os_vif_util [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.151 186962 DEBUG os_vif [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.151 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.152 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.152 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.156 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8aba1b5-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.156 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8aba1b5-0a, col_values=(('external_ids', {'iface-id': 'a8aba1b5-0a8f-4d8c-9284-d89c9b87efed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:c4:17', 'vm-uuid': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.158 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:37 np0005539505 NetworkManager[55134]: <info>  [1764401437.1589] manager: (tapa8aba1b5-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.160 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.164 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:37 np0005539505 nova_compute[186958]: 2025-11-29 07:30:37.165 186962 INFO os_vif [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a')#033[00m
Nov 29 02:30:38 np0005539505 nova_compute[186958]: 2025-11-29 07:30:38.380 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:30:38 np0005539505 nova_compute[186958]: 2025-11-29 07:30:38.381 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:30:38 np0005539505 nova_compute[186958]: 2025-11-29 07:30:38.381 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:b2:c4:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:30:38 np0005539505 nova_compute[186958]: 2025-11-29 07:30:38.382 186962 INFO nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Using config drive#033[00m
Nov 29 02:30:38 np0005539505 podman[243111]: 2025-11-29 07:30:38.743965448 +0000 UTC m=+0.066968306 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:30:38 np0005539505 podman[243112]: 2025-11-29 07:30:38.770742626 +0000 UTC m=+0.094273669 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:30:40 np0005539505 nova_compute[186958]: 2025-11-29 07:30:40.844 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.230 186962 INFO nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Creating config drive at /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.config#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.234 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1jkwapem execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.364 186962 DEBUG oslo_concurrency.processutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1jkwapem" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:30:41 np0005539505 kernel: tapa8aba1b5-0a: entered promiscuous mode
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.4282] manager: (tapa8aba1b5-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Nov 29 02:30:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:41Z|00687|binding|INFO|Claiming lport a8aba1b5-0a8f-4d8c-9284-d89c9b87efed for this chassis.
Nov 29 02:30:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:41Z|00688|binding|INFO|a8aba1b5-0a8f-4d8c-9284-d89c9b87efed: Claiming fa:16:3e:b2:c4:17 10.100.0.21
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.428 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 systemd-udevd[243182]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.456 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:c4:17 10.100.0.21'], port_security=['fa:16:3e:b2:c4:17 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8bf36cb9-27a3-4c82-bc36-6eb782f28369', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e84be612-9691-4492-be4b-902d298eebaf, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.458 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a8aba1b5-0a8f-4d8c-9284-d89c9b87efed in datapath ca1a5c39-f8e9-483b-bec9-4649fde14447 bound to our chassis#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.459 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca1a5c39-f8e9-483b-bec9-4649fde14447#033[00m
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.4648] device (tapa8aba1b5-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.4656] device (tapa8aba1b5-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:30:41 np0005539505 systemd-machined[153285]: New machine qemu-76-instance-0000008f.
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.470 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d298704a-49af-4277-a633-ae6073b4f65b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.471 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca1a5c39-f1 in ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.473 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca1a5c39-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.473 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8f1427-beb9-439a-995b-6c311eb486ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.473 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc810420-dd8e-4651-97ec-16908227b03e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.484 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[55d98106-1da5-4449-9f18-da0057382000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 systemd[1]: Started Virtual Machine qemu-76-instance-0000008f.
Nov 29 02:30:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:41Z|00689|binding|INFO|Setting lport a8aba1b5-0a8f-4d8c-9284-d89c9b87efed ovn-installed in OVS
Nov 29 02:30:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:41Z|00690|binding|INFO|Setting lport a8aba1b5-0a8f-4d8c-9284-d89c9b87efed up in Southbound
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.507 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc8ca30-9ae6-4497-92c7-18e0b5240f7d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.533 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e9f86111-377d-4bcb-9a46-3e5a564d196f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.5387] manager: (tapca1a5c39-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6c28b6-ecf6-4ccb-8f77-0bc641be492d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.566 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[db4a26e1-7c51-4bd8-8c26-2dcc2a2f980e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.569 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f8bcd381-67be-4839-9b11-2e60e2b4ed82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.5926] device (tapca1a5c39-f0): carrier: link connected
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.597 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c91c1745-edfa-4e0a-8200-f77e2d707cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.615 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fa024ea7-5403-4f37-92c2-1c3647dabefb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca1a5c39-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:e8:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 688917, 'reachable_time': 31835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243216, 'error': None, 'target': 'ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.634 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b574383d-2112-4796-a665-5267577f8312]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:e8d9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 688917, 'tstamp': 688917}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243217, 'error': None, 'target': 'ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.653 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e39fcae2-74cd-4643-aebc-4e61ffcfc663]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca1a5c39-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:e8:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 688917, 'reachable_time': 31835, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243218, 'error': None, 'target': 'ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.679 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe97764-5f93-4e73-819d-cee788062e7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.734 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aef34595-2128-43d6-9c60-51d7715de6ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.735 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca1a5c39-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.736 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.736 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca1a5c39-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.738 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 NetworkManager[55134]: <info>  [1764401441.7389] manager: (tapca1a5c39-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Nov 29 02:30:41 np0005539505 kernel: tapca1a5c39-f0: entered promiscuous mode
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.742 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca1a5c39-f0, col_values=(('external_ids', {'iface-id': '91d4c43b-3c14-4071-996a-5ec1344c7386'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.743 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:41Z|00691|binding|INFO|Releasing lport 91d4c43b-3c14-4071-996a-5ec1344c7386 from this chassis (sb_readonly=0)
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.745 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca1a5c39-f8e9-483b-bec9-4649fde14447.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca1a5c39-f8e9-483b-bec9-4649fde14447.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.745 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[75e15da8-4f60-4ee6-9eef-9de9f7d67159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.746 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-ca1a5c39-f8e9-483b-bec9-4649fde14447
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/ca1a5c39-f8e9-483b-bec9-4649fde14447.pid.haproxy
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID ca1a5c39-f8e9-483b-bec9-4649fde14447
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:30:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:41.747 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'env', 'PROCESS_TAG=haproxy-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca1a5c39-f8e9-483b-bec9-4649fde14447.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.808 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401441.8081648, f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.809 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] VM Started (Lifecycle Event)#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.924 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.931 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401441.809155, f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.931 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.964 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:30:41 np0005539505 nova_compute[186958]: 2025-11-29 07:30:41.968 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:30:42 np0005539505 nova_compute[186958]: 2025-11-29 07:30:42.043 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:30:42 np0005539505 podman[243257]: 2025-11-29 07:30:42.058308791 +0000 UTC m=+0.022935100 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:30:42 np0005539505 nova_compute[186958]: 2025-11-29 07:30:42.158 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:42 np0005539505 podman[243257]: 2025-11-29 07:30:42.169456717 +0000 UTC m=+0.134083006 container create d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:30:42 np0005539505 systemd[1]: Started libpod-conmon-d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342.scope.
Nov 29 02:30:42 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:30:42 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c59a567f2f205d7176104dfd8676eb5bb2d5df9ca85b43fc26e3514845962c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:30:42 np0005539505 podman[243257]: 2025-11-29 07:30:42.266830833 +0000 UTC m=+0.231457162 container init d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:30:42 np0005539505 podman[243257]: 2025-11-29 07:30:42.274287164 +0000 UTC m=+0.238913453 container start d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 29 02:30:42 np0005539505 podman[243270]: 2025-11-29 07:30:42.279348477 +0000 UTC m=+0.071358770 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:30:42 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [NOTICE]   (243307) : New worker (243318) forked
Nov 29 02:30:42 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [NOTICE]   (243307) : Loading success.
Nov 29 02:30:42 np0005539505 podman[243273]: 2025-11-29 07:30:42.341098775 +0000 UTC m=+0.124174685 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.548 186962 DEBUG nova.compute.manager [req-f5cc1792-9b22-45d4-96aa-25b8cc6f894c req-fcfa5c32-c54a-4f66-9eaa-58a48450a534 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.548 186962 DEBUG oslo_concurrency.lockutils [req-f5cc1792-9b22-45d4-96aa-25b8cc6f894c req-fcfa5c32-c54a-4f66-9eaa-58a48450a534 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.549 186962 DEBUG oslo_concurrency.lockutils [req-f5cc1792-9b22-45d4-96aa-25b8cc6f894c req-fcfa5c32-c54a-4f66-9eaa-58a48450a534 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.549 186962 DEBUG oslo_concurrency.lockutils [req-f5cc1792-9b22-45d4-96aa-25b8cc6f894c req-fcfa5c32-c54a-4f66-9eaa-58a48450a534 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.550 186962 DEBUG nova.compute.manager [req-f5cc1792-9b22-45d4-96aa-25b8cc6f894c req-fcfa5c32-c54a-4f66-9eaa-58a48450a534 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Processing event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.551 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.556 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401443.5561244, f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.557 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] VM Resumed (Lifecycle Event)
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.562 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.569 186962 INFO nova.virt.libvirt.driver [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance spawned successfully.
Nov 29 02:30:43 np0005539505 nova_compute[186958]: 2025-11-29 07:30:43.570 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.442 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.448 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.448 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.449 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.449 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.450 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.450 186962 DEBUG nova.virt.libvirt.driver [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.455 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 02:30:44 np0005539505 nova_compute[186958]: 2025-11-29 07:30:44.639 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 02:30:45 np0005539505 nova_compute[186958]: 2025-11-29 07:30:45.140 186962 INFO nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Took 20.89 seconds to spawn the instance on the hypervisor.
Nov 29 02:30:45 np0005539505 nova_compute[186958]: 2025-11-29 07:30:45.141 186962 DEBUG nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:30:45 np0005539505 nova_compute[186958]: 2025-11-29 07:30:45.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.011 186962 DEBUG nova.compute.manager [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.012 186962 DEBUG oslo_concurrency.lockutils [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.013 186962 DEBUG oslo_concurrency.lockutils [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.013 186962 DEBUG oslo_concurrency.lockutils [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.014 186962 DEBUG nova.compute.manager [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] No waiting events found dispatching network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.014 186962 WARNING nova.compute.manager [req-f3813a42-b1e4-446b-a097-dabe2ba710f8 req-cd15d1ff-8fcc-4d96-a5c3-e3046a1bb0d7 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received unexpected event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed for instance with vm_state building and task_state spawning.
Nov 29 02:30:46 np0005539505 nova_compute[186958]: 2025-11-29 07:30:46.838 186962 INFO nova.compute.manager [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Took 24.35 seconds to build instance.
Nov 29 02:30:47 np0005539505 nova_compute[186958]: 2025-11-29 07:30:47.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.103 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'name': 'tempest-TestNetworkBasicOps-server-1778982610', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.129 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.130 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32dc74c6-7e39-4277-8749-55cf718edde9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.105061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '531326be-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': '8309b78d6bf6cf68a48fb45ba3bc9daa7c1cd37e5e47b8d5aa150638a0843a6c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.105061', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53133762-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': '40571035fed94378083854e6fa6254413567d00256afa66eb57fd5bf8978c611'}]}, 'timestamp': '2025-11-29 07:30:48.130866', '_unique_id': '0ee3d79e2da544a980783dbf40a373fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.133 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2a1f042-7e2d-4fc1-a363-dc5d52d806f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.133751', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5313b34a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'bb89e5abb3472f0dc5256cc7fa66db364604445adbe9fbbfd85a6e394d910d3f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.133751', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5313bebc-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': '0a05f6fa5d59089f0d6c37577445a9f4d464fdf0146b8773cd7800c3e23b448f'}]}, 'timestamp': '2025-11-29 07:30:48.134335', '_unique_id': 'ecf2cf82c7f74940a97460b0eba32471'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.139 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 / tapa8aba1b5-0a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.139 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ede4645-85ee-4dd1-ac7a-40a9e131ede5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.135986', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '5314a7a0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '7e6ca6cf4062d637dbac1053d05ed6b04720d08198335c55055c45851367ebf6'}]}, 'timestamp': '2025-11-29 07:30:48.140365', '_unique_id': '523a714d037b4f72b15c5af9d917bcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.141 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.142 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09fe4214-80dc-4f22-95cf-117e24852a40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.142273', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '53150272-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': 'b9b7113246a1933602b2f57cde979b04afc8cde20805d2ec6f75e2145c033d5c'}]}, 'timestamp': '2025-11-29 07:30:48.142664', '_unique_id': '63958c1c94d14765bc480d63659dec6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.144 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.144 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea479e7d-3d3d-43d1-bb6c-ee58079af68f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.144608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53155d3a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'f7aa855533a6525aff765d2b9adee637ec5eeb4f4e33c4c093b83c6225ba1c6a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.144608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53156a6e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'b0ddf85a7b16aa5b870d3cded7b2d94748ed0a6ad44fff2d17973be6e8604421'}]}, 'timestamp': '2025-11-29 07:30:48.145327', '_unique_id': '6575bbb613e44056a6d8bad4a742f88e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.145 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.161 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.161 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ea11903-99bc-4166-ab06-70cf3d23d70a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.147425', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5317ecd0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': 'cc62a83130e417c1b71f6ee9dc96fa83f9d994b9a663cb11ab132bc316113265'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.147425', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5317fd2e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': '8d8a1120a6d78a151dd9bced48bd611691e605e64abad622d7e747735695a651'}]}, 'timestamp': '2025-11-29 07:30:48.162173', '_unique_id': '26b070050dc547d08a108fc28ed65751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.164 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.latency volume: 136803169 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.164 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.read.latency volume: 595747 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f8a86bc-9910-42b7-a6ec-7360d8c99f27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 136803169, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.164313', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '53185d5a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'b8c19ac5e1a222f8b47ed1f86170ad80f23c26df6c79aba9d797ebb2aaac5b7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 595747, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.164313', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53186700-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': '4b3b8109ab9489edda57f5bdfd401b8d0fc1e7d50867af4d23f9309077334c21'}]}, 'timestamp': '2025-11-29 07:30:48.164837', '_unique_id': 'a1103af59cb34a5894abff46e105eaf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.166 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41674ad9-b303-4e67-970a-a46bbaff2673', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.166409', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '5318af80-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '19c92e2039f0decd0432141f86e309d83154066fa26ebb82d6aa7491623210e4'}]}, 'timestamp': '2025-11-29 07:30:48.166707', '_unique_id': 'e601fb2267ad400886b76f9ec9fd39e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.168 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.168 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cfe5a65-65c0-4f4f-a0fc-e389985235b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.168500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '531900ac-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': '98de3f065ec23f7f9be756192f6f09cc5fa21a69e3eec562ecb15af389f2e1be'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.168500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '53190b42-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': 'f24b6a7518696d8ff8d51191780df9e9826cd6b420e99d3f6057a94922e2033f'}]}, 'timestamp': '2025-11-29 07:30:48.169040', '_unique_id': '47be54e0b6974f3db45b1260f99bfd40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.170 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.171 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>]
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.171 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a32c4f6-813b-4843-ab3c-d0064b4217a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.171544', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '53197974-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '79823e2be3757d98cc349c1138bde0b96bf15070097d07aa87988a4996727372'}]}, 'timestamp': '2025-11-29 07:30:48.171926', '_unique_id': '74a3e5239b8e4dcab3afe46bd17d2511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.173 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.174 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>]
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.174 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.174 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 nova_compute[186958]: 2025-11-29 07:30:48.174 186962 DEBUG oslo_concurrency.lockutils [None req-f554a69a-759c-4deb-a25e-5c412992b235 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55001c13-ee1b-4f0b-8b6d-c129413516d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.174518', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5319edf0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'e4fd1454b2977228e47dd3e04e9cd9d888c9fa9c1b22d4e68e455cdb6f9557d5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.174518', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5319fb1a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'e6ce00763ea6ba7991aa2b5245df4eeb34f0e118da95ec3158ee18161edd620c'}]}, 'timestamp': '2025-11-29 07:30:48.175257', '_unique_id': 'bacf01b10a1f4872ab3d2be53b74e08a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.175 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.177 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08305f23-3a0b-46a9-92ac-c5932ce8993f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.177620', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '531a6816-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '6444336a605b51c82b0a1d1625f48a989f6fb45ff21f025a687995f5e8f7ad9f'}]}, 'timestamp': '2025-11-29 07:30:48.178047', '_unique_id': 'd72026ea6e9343678ff87492eb551069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.180 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dff1881-0a77-4591-b101-450b815b5088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.180062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '531ac77a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': 'c7c5f88d876500f956d3abe42d9f0425e4fcc6cf7bbf29b0d642fa9eae18dcd7'}]}, 'timestamp': '2025-11-29 07:30:48.180474', '_unique_id': '9b11a27ca6624c97b45787b4521d94f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.182 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.182 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>]
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.183 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.183 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1783bbd5-b918-423d-8580-61013f8783b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.183050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '531b3b42-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': 'ed34556b8b70d690d04c15ddf40d57d0b32480b075cd673322566f1639423943'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.183050', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '531b488a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.745926144, 'message_signature': '50acfefb04399cfe68b8df7fb6df1fb7270a00e6ebad8683e01db429d0ca6a73'}]}, 'timestamp': '2025-11-29 07:30:48.183763', '_unique_id': '92b77d9b13274d4ea66afff17cb132cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.185 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bdb27e8-ab5b-45b3-8ccc-d8e52605b11d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.185825', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '531ba870-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '6d11d5e8955a48d3407644bbcec9c6380d2a050b5aa81162d2e52026a809a5fb'}]}, 'timestamp': '2025-11-29 07:30:48.186251', '_unique_id': '6c7e592b2122428e9ce0c0c81c60be5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.206 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.206 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance f1e06afd-5b90-4cc3-87a6-54bcfcb07b91: ceilometer.compute.pollsters.NoVolumeException
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.207 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feb139d2-1346-4665-bc2f-cd37f2aed17d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.207091', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '531ee8b4-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': '46756b4fedd4146f8f94e71d2ee1eb2437c22efa0a482b8463f5aab9167e2da9'}]}, 'timestamp': '2025-11-29 07:30:48.207587', '_unique_id': '92fbf1319d264d61aa03c47dc4ecf87b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.210 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.210 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1778982610>]
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.210 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.211 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0db832d0-db99-4286-94a0-12cd19fdb519', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-vda', 'timestamp': '2025-11-29T07:30:48.210692', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '531f7392-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': '00db7af6eabe0ded0b380e8adadfb8d196e94d44066314eefab84037ae273f8c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-sda', 'timestamp': '2025-11-29T07:30:48.210692', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '531f8684-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.788481589, 'message_signature': 'b537362a47b9ef1288c31032d4f83b24c35983307670be392d7f914fabf4cb05'}]}, 'timestamp': '2025-11-29 07:30:48.211612', '_unique_id': '3a90ce70aea34de08909135cc5c86e4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.213 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1153f8c3-33fa-42b7-a055-5f1b0353fe82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.213734', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '531fea98-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': 'f30adc52b27f5326d77861f0e73d10eacf004d35f4ff50dea69b5249dd6899fa'}]}, 'timestamp': '2025-11-29 07:30:48.214190', '_unique_id': '962a7458842645ec949bf240949e1789'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.216 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09dd9b3b-860b-4704-9466-54bd1ecd550c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000008f-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-tapa8aba1b5-0a', 'timestamp': '2025-11-29T07:30:48.216462', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'tapa8aba1b5-0a', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:c4:17', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8aba1b5-0a'}, 'message_id': '53205492-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.776844149, 'message_signature': 'c457b0bd0cc4927ca27989e308d10eb7bf2fbeb708adad9e4864de9a296a620d'}]}, 'timestamp': '2025-11-29 07:30:48.216884', '_unique_id': 'afd9c2234089470392197f1b2cd8fdc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.217 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.219 12 DEBUG ceilometer.compute.pollsters [-] f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/cpu volume: 4410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a152af6d-7d3a-4b8e-8692-7bd4d1a914d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4410000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'timestamp': '2025-11-29T07:30:48.219167', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1778982610', 'name': 'instance-0000008f', 'instance_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5320c198-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 6895.847204611, 'message_signature': '2d2c9f9121fa58d7b38357c31d15bb8c2bd763d6adda510c38c2192928d9d644'}]}, 'timestamp': '2025-11-29 07:30:48.219711', '_unique_id': '6e33382395e84cdfa19f4a442cba2b56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:30:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:30:48.220 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:30:50 np0005539505 nova_compute[186958]: 2025-11-29 07:30:50.936 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:52 np0005539505 nova_compute[186958]: 2025-11-29 07:30:52.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:55Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b2:c4:17 10.100.0.21
Nov 29 02:30:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:30:55Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b2:c4:17 10.100.0.21
Nov 29 02:30:55 np0005539505 nova_compute[186958]: 2025-11-29 07:30:55.937 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:30:56 np0005539505 podman[243346]: 2025-11-29 07:30:56.729173369 +0000 UTC m=+0.055344658 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:30:56 np0005539505 podman[243345]: 2025-11-29 07:30:56.736177157 +0000 UTC m=+0.063004584 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:30:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:56.827 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:30:56 np0005539505 nova_compute[186958]: 2025-11-29 07:30:56.828 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:56.829 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:30:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:30:56.829 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:30:57 np0005539505 nova_compute[186958]: 2025-11-29 07:30:57.209 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:30:59 np0005539505 podman[243387]: 2025-11-29 07:30:59.723167025 +0000 UTC m=+0.044835970 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:31:00 np0005539505 nova_compute[186958]: 2025-11-29 07:31:00.971 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:02 np0005539505 nova_compute[186958]: 2025-11-29 07:31:02.211 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:05 np0005539505 nova_compute[186958]: 2025-11-29 07:31:05.972 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:07 np0005539505 nova_compute[186958]: 2025-11-29 07:31:07.213 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:09 np0005539505 podman[243406]: 2025-11-29 07:31:09.724996389 +0000 UTC m=+0.049949915 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:31:09 np0005539505 podman[243407]: 2025-11-29 07:31:09.78650161 +0000 UTC m=+0.104629873 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 29 02:31:11 np0005539505 nova_compute[186958]: 2025-11-29 07:31:11.024 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:12 np0005539505 nova_compute[186958]: 2025-11-29 07:31:12.261 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:12 np0005539505 podman[243457]: 2025-11-29 07:31:12.759265365 +0000 UTC m=+0.089826623 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 29 02:31:12 np0005539505 podman[243456]: 2025-11-29 07:31:12.772933082 +0000 UTC m=+0.098053446 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:31:15 np0005539505 nova_compute[186958]: 2025-11-29 07:31:15.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:16 np0005539505 nova_compute[186958]: 2025-11-29 07:31:16.026 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:17 np0005539505 nova_compute[186958]: 2025-11-29 07:31:17.262 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:17 np0005539505 nova_compute[186958]: 2025-11-29 07:31:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:19 np0005539505 ovn_controller[95143]: 2025-11-29T07:31:19Z|00692|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 29 02:31:21 np0005539505 nova_compute[186958]: 2025-11-29 07:31:21.028 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:22 np0005539505 nova_compute[186958]: 2025-11-29 07:31:22.264 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:22 np0005539505 nova_compute[186958]: 2025-11-29 07:31:22.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:22 np0005539505 nova_compute[186958]: 2025-11-29 07:31:22.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:31:25 np0005539505 nova_compute[186958]: 2025-11-29 07:31:25.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:26 np0005539505 nova_compute[186958]: 2025-11-29 07:31:26.030 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:27 np0005539505 nova_compute[186958]: 2025-11-29 07:31:27.305 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:27 np0005539505 nova_compute[186958]: 2025-11-29 07:31:27.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:27.510 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:31:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:27.511 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:31:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:27.511 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:31:27 np0005539505 podman[243497]: 2025-11-29 07:31:27.758385132 +0000 UTC m=+0.076840645 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:31:27 np0005539505 podman[243496]: 2025-11-29 07:31:27.767020497 +0000 UTC m=+0.090748190 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:31:28 np0005539505 nova_compute[186958]: 2025-11-29 07:31:28.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:31:28 np0005539505 nova_compute[186958]: 2025-11-29 07:31:28.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:31:28 np0005539505 nova_compute[186958]: 2025-11-29 07:31:28.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:31:30 np0005539505 podman[243542]: 2025-11-29 07:31:30.762933896 +0000 UTC m=+0.087583599 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 02:31:31 np0005539505 nova_compute[186958]: 2025-11-29 07:31:31.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:32 np0005539505 nova_compute[186958]: 2025-11-29 07:31:32.160 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:31:32 np0005539505 nova_compute[186958]: 2025-11-29 07:31:32.161 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:31:32 np0005539505 nova_compute[186958]: 2025-11-29 07:31:32.161 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:31:32 np0005539505 nova_compute[186958]: 2025-11-29 07:31:32.162 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:31:32 np0005539505 nova_compute[186958]: 2025-11-29 07:31:32.307 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:36 np0005539505 nova_compute[186958]: 2025-11-29 07:31:36.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:37 np0005539505 nova_compute[186958]: 2025-11-29 07:31:37.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:40 np0005539505 podman[243561]: 2025-11-29 07:31:40.766407575 +0000 UTC m=+0.096603875 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:31:40 np0005539505 podman[243562]: 2025-11-29 07:31:40.803194696 +0000 UTC m=+0.129906747 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:31:41 np0005539505 nova_compute[186958]: 2025-11-29 07:31:41.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:42 np0005539505 nova_compute[186958]: 2025-11-29 07:31:42.313 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:43 np0005539505 podman[243611]: 2025-11-29 07:31:43.737627805 +0000 UTC m=+0.064875557 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:31:43 np0005539505 podman[243610]: 2025-11-29 07:31:43.752662001 +0000 UTC m=+0.080967313 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:31:46 np0005539505 nova_compute[186958]: 2025-11-29 07:31:46.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:47 np0005539505 nova_compute[186958]: 2025-11-29 07:31:47.316 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:51 np0005539505 nova_compute[186958]: 2025-11-29 07:31:51.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:31:51Z|00693|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Nov 29 02:31:52 np0005539505 nova_compute[186958]: 2025-11-29 07:31:52.318 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:54.835 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:31:54 np0005539505 nova_compute[186958]: 2025-11-29 07:31:54.835 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:54 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:54.836 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:31:54 np0005539505 nova_compute[186958]: 2025-11-29 07:31:54.881 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 10.76 sec#033[00m
Nov 29 02:31:56 np0005539505 nova_compute[186958]: 2025-11-29 07:31:56.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:57 np0005539505 nova_compute[186958]: 2025-11-29 07:31:57.320 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:31:57.838 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:31:58 np0005539505 podman[243650]: 2025-11-29 07:31:58.743210547 +0000 UTC m=+0.055195166 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:31:58 np0005539505 podman[243649]: 2025-11-29 07:31:58.777411141 +0000 UTC m=+0.094012370 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, version=9.6, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:31:59 np0005539505 nova_compute[186958]: 2025-11-29 07:31:59.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:59 np0005539505 NetworkManager[55134]: <info>  [1764401519.1556] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Nov 29 02:31:59 np0005539505 NetworkManager[55134]: <info>  [1764401519.1569] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/340)
Nov 29 02:31:59 np0005539505 nova_compute[186958]: 2025-11-29 07:31:59.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:31:59Z|00694|binding|INFO|Releasing lport 91d4c43b-3c14-4071-996a-5ec1344c7386 from this chassis (sb_readonly=0)
Nov 29 02:31:59 np0005539505 nova_compute[186958]: 2025-11-29 07:31:59.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:31:59 np0005539505 nova_compute[186958]: 2025-11-29 07:31:59.546 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updating instance_info_cache with network_info: [{"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.151 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.152 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.152 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.153 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.153 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.813 186962 INFO nova.compute.manager [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Terminating instance#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.848 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.849 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.850 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:00 np0005539505 nova_compute[186958]: 2025-11-29 07:32:00.850 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:01 np0005539505 nova_compute[186958]: 2025-11-29 07:32:01.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:01 np0005539505 podman[243699]: 2025-11-29 07:32:01.728562975 +0000 UTC m=+0.062967685 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:32:01 np0005539505 nova_compute[186958]: 2025-11-29 07:32:01.776 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:01 np0005539505 nova_compute[186958]: 2025-11-29 07:32:01.776 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:01 np0005539505 nova_compute[186958]: 2025-11-29 07:32:01.776 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:01 np0005539505 nova_compute[186958]: 2025-11-29 07:32:01.777 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:32:02 np0005539505 nova_compute[186958]: 2025-11-29 07:32:02.321 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.589 186962 DEBUG nova.compute.manager [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:32:04 np0005539505 kernel: tapa8aba1b5-0a (unregistering): left promiscuous mode
Nov 29 02:32:04 np0005539505 NetworkManager[55134]: <info>  [1764401524.6099] device (tapa8aba1b5-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:32:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:32:04Z|00695|binding|INFO|Releasing lport a8aba1b5-0a8f-4d8c-9284-d89c9b87efed from this chassis (sb_readonly=0)
Nov 29 02:32:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:32:04Z|00696|binding|INFO|Setting lport a8aba1b5-0a8f-4d8c-9284-d89c9b87efed down in Southbound
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.646 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:32:04Z|00697|binding|INFO|Removing iface tapa8aba1b5-0a ovn-installed in OVS
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.650 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:04 np0005539505 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Nov 29 02:32:04 np0005539505 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d0000008f.scope: Consumed 15.460s CPU time.
Nov 29 02:32:04 np0005539505 systemd-machined[153285]: Machine qemu-76-instance-0000008f terminated.
Nov 29 02:32:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:04.752 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:c4:17 10.100.0.21'], port_security=['fa:16:3e:b2:c4:17 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': 'f1e06afd-5b90-4cc3-87a6-54bcfcb07b91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8bf36cb9-27a3-4c82-bc36-6eb782f28369', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e84be612-9691-4492-be4b-902d298eebaf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:04.754 104094 INFO neutron.agent.ovn.metadata.agent [-] Port a8aba1b5-0a8f-4d8c-9284-d89c9b87efed in datapath ca1a5c39-f8e9-483b-bec9-4649fde14447 unbound from our chassis#033[00m
Nov 29 02:32:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:04.755 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca1a5c39-f8e9-483b-bec9-4649fde14447, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:32:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:04.757 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3cc2cd36-0ed7-4fdb-9a77-9a412308762b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:04.757 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447 namespace which is not needed anymore#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.864 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.882 186962 INFO nova.virt.libvirt.driver [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Instance destroyed successfully.#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.883 186962 DEBUG nova.objects.instance [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:32:04 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [NOTICE]   (243307) : haproxy version is 2.8.14-c23fe91
Nov 29 02:32:04 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [NOTICE]   (243307) : path to executable is /usr/sbin/haproxy
Nov 29 02:32:04 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [WARNING]  (243307) : Exiting Master process...
Nov 29 02:32:04 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [ALERT]    (243307) : Current worker (243318) exited with code 143 (Terminated)
Nov 29 02:32:04 np0005539505 neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447[243279]: [WARNING]  (243307) : All workers exited. Exiting... (0)
Nov 29 02:32:04 np0005539505 systemd[1]: libpod-d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342.scope: Deactivated successfully.
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.922 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.923 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:32:04 np0005539505 podman[243756]: 2025-11-29 07:32:04.927500723 +0000 UTC m=+0.063067648 container died d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:32:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay-2c59a567f2f205d7176104dfd8676eb5bb2d5df9ca85b43fc26e3514845962c5-merged.mount: Deactivated successfully.
Nov 29 02:32:04 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342-userdata-shm.mount: Deactivated successfully.
Nov 29 02:32:04 np0005539505 podman[243756]: 2025-11-29 07:32:04.965641458 +0000 UTC m=+0.101208373 container cleanup d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:32:04 np0005539505 nova_compute[186958]: 2025-11-29 07:32:04.984 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:32:04 np0005539505 systemd[1]: libpod-conmon-d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342.scope: Deactivated successfully.
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.130 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.134 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5554MB free_disk=73.04494857788086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.134 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.134 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.210 186962 DEBUG nova.virt.libvirt.vif [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:30:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1778982610',display_name='tempest-TestNetworkBasicOps-server-1778982610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1778982610',id=143,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7vmcEuMTXEy7Figk9HNQZAVTFRHfbot7/wlxWHZi4/TPIsOsfbK8Oqf7oXJmr2L7vCQh9F6AJKHLqK3jMPOWatEqd+CE0cXMu9iIXc1Crg6VnhLG3Foof5KhaIpHTbeg==',key_name='tempest-TestNetworkBasicOps-1699100556',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:30:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0wojgbjj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:30:46Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=f1e06afd-5b90-4cc3-87a6-54bcfcb07b91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.210 186962 DEBUG nova.network.os_vif_util [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "address": "fa:16:3e:b2:c4:17", "network": {"id": "ca1a5c39-f8e9-483b-bec9-4649fde14447", "bridge": "br-int", "label": "tempest-network-smoke--2019186041", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8aba1b5-0a", "ovs_interfaceid": "a8aba1b5-0a8f-4d8c-9284-d89c9b87efed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.211 186962 DEBUG nova.network.os_vif_util [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.212 186962 DEBUG os_vif [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.214 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.215 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8aba1b5-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.216 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.218 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.222 186962 INFO os_vif [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:c4:17,bridge_name='br-int',has_traffic_filtering=True,id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed,network=Network(ca1a5c39-f8e9-483b-bec9-4649fde14447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8aba1b5-0a')#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.222 186962 INFO nova.virt.libvirt.driver [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Deleting instance files /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91_del#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.223 186962 INFO nova.virt.libvirt.driver [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Deletion of /var/lib/nova/instances/f1e06afd-5b90-4cc3-87a6-54bcfcb07b91_del complete#033[00m
Nov 29 02:32:05 np0005539505 podman[243791]: 2025-11-29 07:32:05.438889683 +0000 UTC m=+0.451202675 container remove d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.447 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[450d404f-2d7b-41fd-a024-054a7eec1509]: (4, ('Sat Nov 29 07:32:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447 (d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342)\nd833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342\nSat Nov 29 07:32:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447 (d833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342)\nd833ae19fbce63dfaf950970a37d5e506f74841cca0d296260645748d9d6e342\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.449 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2abbe8-9171-4022-afdb-1c6d9081335d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.450 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca1a5c39-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:05 np0005539505 kernel: tapca1a5c39-f0: left promiscuous mode
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.453 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:05 np0005539505 nova_compute[186958]: 2025-11-29 07:32:05.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.480 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[982f23b8-4416-455d-bbbc-1e1ee703c51f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.613 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfc3379-71d1-4086-b4b0-36751f01c9c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.616 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[88e653d4-8511-4086-9a96-902c346514cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.635 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[95d32d6b-d952-41f1-bfdc-6b9511eea4e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 688911, 'reachable_time': 32649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243810, 'error': None, 'target': 'ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.642 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca1a5c39-f8e9-483b-bec9-4649fde14447 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:32:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:05.642 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[22c0fe45-4f18-4ef2-8d0b-641f39f7f434]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:05 np0005539505 systemd[1]: run-netns-ovnmeta\x2dca1a5c39\x2df8e9\x2d483b\x2dbec9\x2d4649fde14447.mount: Deactivated successfully.
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.094 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.163 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:06 np0005539505 nova_compute[186958]: 2025-11-29 07:32:06.394 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.458 186962 DEBUG nova.compute.manager [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-unplugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.458 186962 DEBUG oslo_concurrency.lockutils [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.460 186962 DEBUG oslo_concurrency.lockutils [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.460 186962 DEBUG oslo_concurrency.lockutils [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.460 186962 DEBUG nova.compute.manager [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] No waiting events found dispatching network-vif-unplugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.461 186962 DEBUG nova.compute.manager [req-f2e641db-6df0-4acd-98de-8aabf56f16d6 req-c34c00e7-9043-4ef5-aeb4-13cd068cabbb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-unplugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.609 186962 INFO nova.compute.manager [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Took 3.02 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.610 186962 DEBUG oslo.service.loopingcall [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.611 186962 DEBUG nova.compute.manager [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:32:07 np0005539505 nova_compute[186958]: 2025-11-29 07:32:07.611 186962 DEBUG nova.network.neutron [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:32:08 np0005539505 nova_compute[186958]: 2025-11-29 07:32:08.408 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:32:08 np0005539505 nova_compute[186958]: 2025-11-29 07:32:08.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.726 186962 DEBUG nova.compute.manager [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.728 186962 DEBUG oslo_concurrency.lockutils [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.729 186962 DEBUG oslo_concurrency.lockutils [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.729 186962 DEBUG oslo_concurrency.lockutils [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.730 186962 DEBUG nova.compute.manager [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] No waiting events found dispatching network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.730 186962 WARNING nova.compute.manager [req-fd83fd68-7fcb-4ec5-afe6-d27011cc91e0 req-6c54c762-33bb-4604-a46f-ba8ddcb55f71 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received unexpected event network-vif-plugged-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:32:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:09.958 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:09 np0005539505 nova_compute[186958]: 2025-11-29 07:32:09.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:09.961 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.040 186962 DEBUG nova.network.neutron [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.251 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.295 186962 INFO nova.compute.manager [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Took 2.68 seconds to deallocate network for instance.#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.381 186962 DEBUG nova.compute.manager [req-c4fc462a-8228-4827-bb6e-6e128ba0f797 req-76a8935b-5e9e-40a7-84da-6eee8b193a63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Received event network-vif-deleted-a8aba1b5-0a8f-4d8c-9284-d89c9b87efed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.382 186962 INFO nova.compute.manager [req-c4fc462a-8228-4827-bb6e-6e128ba0f797 req-76a8935b-5e9e-40a7-84da-6eee8b193a63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Neutron deleted interface a8aba1b5-0a8f-4d8c-9284-d89c9b87efed; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.382 186962 DEBUG nova.network.neutron [req-c4fc462a-8228-4827-bb6e-6e128ba0f797 req-76a8935b-5e9e-40a7-84da-6eee8b193a63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.632 186962 DEBUG nova.compute.manager [req-c4fc462a-8228-4827-bb6e-6e128ba0f797 req-76a8935b-5e9e-40a7-84da-6eee8b193a63 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Detach interface failed, port_id=a8aba1b5-0a8f-4d8c-9284-d89c9b87efed, reason: Instance f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.997 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:10 np0005539505 nova_compute[186958]: 2025-11-29 07:32:10.997 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.040 186962 DEBUG nova.compute.provider_tree [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.047 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.112 186962 DEBUG nova.scheduler.client.report [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.231 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.349 186962 INFO nova.scheduler.client.report [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance f1e06afd-5b90-4cc3-87a6-54bcfcb07b91#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.403 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:11 np0005539505 nova_compute[186958]: 2025-11-29 07:32:11.557 186962 DEBUG oslo_concurrency.lockutils [None req-153fd05e-1212-45af-be2b-905563dda448 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "f1e06afd-5b90-4cc3-87a6-54bcfcb07b91" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:11 np0005539505 podman[243811]: 2025-11-29 07:32:11.718204957 +0000 UTC m=+0.051447111 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:32:11 np0005539505 podman[243812]: 2025-11-29 07:32:11.760461598 +0000 UTC m=+0.087100796 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:32:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:13.964 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:32:14 np0005539505 podman[243863]: 2025-11-29 07:32:14.758334859 +0000 UTC m=+0.084752660 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:32:14 np0005539505 podman[243862]: 2025-11-29 07:32:14.769021181 +0000 UTC m=+0.091486009 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:32:15 np0005539505 nova_compute[186958]: 2025-11-29 07:32:15.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:16 np0005539505 nova_compute[186958]: 2025-11-29 07:32:16.049 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:16 np0005539505 nova_compute[186958]: 2025-11-29 07:32:16.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:17 np0005539505 nova_compute[186958]: 2025-11-29 07:32:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:19 np0005539505 nova_compute[186958]: 2025-11-29 07:32:19.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:19 np0005539505 nova_compute[186958]: 2025-11-29 07:32:19.862 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401524.8611405, f1e06afd-5b90-4cc3-87a6-54bcfcb07b91 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:32:19 np0005539505 nova_compute[186958]: 2025-11-29 07:32:19.863 186962 INFO nova.compute.manager [-] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:32:20 np0005539505 nova_compute[186958]: 2025-11-29 07:32:20.258 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:20 np0005539505 nova_compute[186958]: 2025-11-29 07:32:20.629 186962 DEBUG nova.compute.manager [None req-90cb5f14-f238-40bc-80a8-b4db3ac1235d - - - - - -] [instance: f1e06afd-5b90-4cc3-87a6-54bcfcb07b91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:32:21 np0005539505 nova_compute[186958]: 2025-11-29 07:32:21.052 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:23 np0005539505 nova_compute[186958]: 2025-11-29 07:32:23.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:23 np0005539505 nova_compute[186958]: 2025-11-29 07:32:23.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:32:25 np0005539505 nova_compute[186958]: 2025-11-29 07:32:25.261 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:25 np0005539505 nova_compute[186958]: 2025-11-29 07:32:25.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:26 np0005539505 nova_compute[186958]: 2025-11-29 07:32:26.098 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:27.512 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:27.512 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:27.512 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:28 np0005539505 nova_compute[186958]: 2025-11-29 07:32:28.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:28.659 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8:0:1:f816:3eff:fe4a:f1bd 2001:db8::f816:3eff:fe4a:f1bd'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4a:f1bd/64 2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b0b0536c-6e35-42c5-8936-a1236a4f216e) old=Port_Binding(mac=['fa:16:3e:4a:f1:bd 2001:db8::f816:3eff:fe4a:f1bd'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4a:f1bd/64', 'neutron:device_id': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:28.660 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b0b0536c-6e35-42c5-8936-a1236a4f216e in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 updated#033[00m
Nov 29 02:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:28.662 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 716ed53e-cc56-4286-b418-2f5e02d33124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:32:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:32:28.663 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4af33d24-ddff-4212-9df8-38d78f06c6f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:32:29 np0005539505 nova_compute[186958]: 2025-11-29 07:32:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:29 np0005539505 podman[243904]: 2025-11-29 07:32:29.725089212 +0000 UTC m=+0.048795976 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:32:29 np0005539505 podman[243903]: 2025-11-29 07:32:29.729865337 +0000 UTC m=+0.057443360 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.264 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.455 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.456 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.456 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.559 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:32:30 np0005539505 nova_compute[186958]: 2025-11-29 07:32:30.559 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.023 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.023 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.024 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.024 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.145 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.235 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.236 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5706MB free_disk=73.07363510131836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.236 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.236 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.522 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.523 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.578 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.622 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.738 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:32:31 np0005539505 nova_compute[186958]: 2025-11-29 07:32:31.738 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:32 np0005539505 nova_compute[186958]: 2025-11-29 07:32:32.656 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:32 np0005539505 podman[243949]: 2025-11-29 07:32:32.712114738 +0000 UTC m=+0.045117432 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:32:35 np0005539505 nova_compute[186958]: 2025-11-29 07:32:35.266 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:36 np0005539505 nova_compute[186958]: 2025-11-29 07:32:36.145 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:38 np0005539505 nova_compute[186958]: 2025-11-29 07:32:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:40 np0005539505 nova_compute[186958]: 2025-11-29 07:32:40.268 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:41 np0005539505 nova_compute[186958]: 2025-11-29 07:32:41.149 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:42 np0005539505 nova_compute[186958]: 2025-11-29 07:32:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:42 np0005539505 nova_compute[186958]: 2025-11-29 07:32:42.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:32:42 np0005539505 nova_compute[186958]: 2025-11-29 07:32:42.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:32:42 np0005539505 podman[243968]: 2025-11-29 07:32:42.75541815 +0000 UTC m=+0.077280098 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:32:42 np0005539505 podman[243969]: 2025-11-29 07:32:42.810014079 +0000 UTC m=+0.128494802 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:32:43 np0005539505 nova_compute[186958]: 2025-11-29 07:32:43.945 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:43 np0005539505 nova_compute[186958]: 2025-11-29 07:32:43.946 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:43 np0005539505 nova_compute[186958]: 2025-11-29 07:32:43.982 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.262 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.262 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.268 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.268 186962 INFO nova.compute.claims [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:32:44 np0005539505 nova_compute[186958]: 2025-11-29 07:32:44.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:32:45 np0005539505 nova_compute[186958]: 2025-11-29 07:32:45.272 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:45 np0005539505 podman[244018]: 2025-11-29 07:32:45.742810996 +0000 UTC m=+0.073821672 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Nov 29 02:32:45 np0005539505 podman[244019]: 2025-11-29 07:32:45.762668605 +0000 UTC m=+0.088360141 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:32:46 np0005539505 nova_compute[186958]: 2025-11-29 07:32:46.267 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.074 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.385 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.482 186962 DEBUG nova.compute.provider_tree [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.614 186962 DEBUG nova.scheduler.client.report [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.938 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:32:47 np0005539505 nova_compute[186958]: 2025-11-29 07:32:47.940 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:32:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:32:48 np0005539505 nova_compute[186958]: 2025-11-29 07:32:48.428 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:32:48 np0005539505 nova_compute[186958]: 2025-11-29 07:32:48.429 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:32:48 np0005539505 nova_compute[186958]: 2025-11-29 07:32:48.701 186962 DEBUG nova.policy [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:32:48 np0005539505 nova_compute[186958]: 2025-11-29 07:32:48.732 186962 INFO nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:32:48 np0005539505 nova_compute[186958]: 2025-11-29 07:32:48.799 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:32:50 np0005539505 nova_compute[186958]: 2025-11-29 07:32:50.275 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:32:51 np0005539505 nova_compute[186958]: 2025-11-29 07:32:51.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:32:53 np0005539505 nova_compute[186958]: 2025-11-29 07:32:53.861 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:32:55 np0005539505 nova_compute[186958]: 2025-11-29 07:32:55.278 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:32:55 np0005539505 nova_compute[186958]: 2025-11-29 07:32:55.952 186962 WARNING nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] While synchronizing instance power states, found 1 instances in the database and 0 instances on the hypervisor.
Nov 29 02:32:55 np0005539505 nova_compute[186958]: 2025-11-29 07:32:55.952 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Triggering sync for uuid 3f269630-23a3-4378-bca8-2177bcee52e5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Nov 29 02:32:55 np0005539505 nova_compute[186958]: 2025-11-29 07:32:55.953 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.038 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.039 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.040 186962 INFO nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Creating image(s)
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.040 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.040 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.041 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.054 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.152 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.153 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.153 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.165 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.256 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.257 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.311 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.341 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk 1073741824" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.342 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.343 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.399 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.401 186962 DEBUG nova.virt.disk.api [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.402 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.461 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.463 186962 DEBUG nova.virt.disk.api [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 29 02:32:56 np0005539505 nova_compute[186958]: 2025-11-29 07:32:56.464 186962 DEBUG nova.objects.instance [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.020 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.021 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Ensure instance console log exists: /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.021 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.022 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.022 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:32:57 np0005539505 nova_compute[186958]: 2025-11-29 07:32:57.227 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Successfully created port: 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 02:32:59 np0005539505 nova_compute[186958]: 2025-11-29 07:32:59.611 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Successfully created port: d916a91b-985b-452b-a2c4-b166a0be47b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 29 02:33:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:00.009 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:33:00 np0005539505 nova_compute[186958]: 2025-11-29 07:33:00.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:00.011 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:33:00 np0005539505 nova_compute[186958]: 2025-11-29 07:33:00.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:00 np0005539505 podman[244075]: 2025-11-29 07:33:00.753506004 +0000 UTC m=+0.065195678 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:33:00 np0005539505 podman[244074]: 2025-11-29 07:33:00.753601386 +0000 UTC m=+0.070609660 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:33:01 np0005539505 nova_compute[186958]: 2025-11-29 07:33:01.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:02 np0005539505 nova_compute[186958]: 2025-11-29 07:33:02.925 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Successfully updated port: 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 02:33:03 np0005539505 podman[244119]: 2025-11-29 07:33:03.723019836 +0000 UTC m=+0.054477896 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 29 02:33:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:04.013 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.402 186962 DEBUG nova.compute.manager [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.402 186962 DEBUG nova.compute.manager [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing instance network info cache due to event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.402 186962 DEBUG oslo_concurrency.lockutils [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.402 186962 DEBUG oslo_concurrency.lockutils [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.402 186962 DEBUG nova.network.neutron [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing network info cache for port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:33:04 np0005539505 nova_compute[186958]: 2025-11-29 07:33:04.892 186962 DEBUG nova.network.neutron [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 29 02:33:05 np0005539505 nova_compute[186958]: 2025-11-29 07:33:05.283 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.023 186962 DEBUG nova.network.neutron [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.086 186962 DEBUG oslo_concurrency.lockutils [req-f5fc6b32-627c-4bdb-9bc1-9abd038cc1a2 req-ab70a81a-9715-43b0-9b26-2af48de4da77 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.518 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Successfully updated port: d916a91b-985b-452b-a2c4-b166a0be47b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.545 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.545 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.546 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.725 186962 DEBUG nova.compute.manager [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-changed-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.726 186962 DEBUG nova.compute.manager [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing instance network info cache due to event network-changed-d916a91b-985b-452b-a2c4-b166a0be47b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.726 186962 DEBUG oslo_concurrency.lockutils [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:06 np0005539505 nova_compute[186958]: 2025-11-29 07:33:06.850 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:33:10 np0005539505 nova_compute[186958]: 2025-11-29 07:33:10.284 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:11 np0005539505 nova_compute[186958]: 2025-11-29 07:33:11.370 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.648 186962 DEBUG nova.network.neutron [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.733 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.734 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance network_info: |[{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.734 186962 DEBUG oslo_concurrency.lockutils [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.734 186962 DEBUG nova.network.neutron [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing network info cache for port d916a91b-985b-452b-a2c4-b166a0be47b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.739 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Start _get_guest_xml network_info=[{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.745 186962 WARNING nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.751 186962 DEBUG nova.virt.libvirt.host [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.751 186962 DEBUG nova.virt.libvirt.host [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.757 186962 DEBUG nova.virt.libvirt.host [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.758 186962 DEBUG nova.virt.libvirt.host [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.759 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.759 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.760 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.760 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.760 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.760 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.760 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.761 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.761 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.761 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.761 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.762 186962 DEBUG nova.virt.hardware [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.765 186962 DEBUG nova.virt.libvirt.vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:32:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.765 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.766 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.766 186962 DEBUG nova.virt.libvirt.vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:32:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.767 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.767 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.768 186962 DEBUG nova.objects.instance [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.782 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <uuid>3f269630-23a3-4378-bca8-2177bcee52e5</uuid>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <name>instance-00000093</name>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestGettingAddress-server-1508463221</nova:name>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:33:12</nova:creationTime>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:port uuid="1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        <nova:port uuid="d916a91b-985b-452b-a2c4-b166a0be47b3">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe31:64a" ipVersion="6"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe31:64a" ipVersion="6"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="serial">3f269630-23a3-4378-bca8-2177bcee52e5</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="uuid">3f269630-23a3-4378-bca8-2177bcee52e5</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.config"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:4a:ee:26"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <target dev="tap1c6a9dfe-d9"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:31:06:4a"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <target dev="tapd916a91b-98"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/console.log" append="off"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:33:12 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:33:12 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:33:12 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:33:12 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.783 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Preparing to wait for external event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.784 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.784 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.784 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.785 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Preparing to wait for external event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.785 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.785 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.785 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.786 186962 DEBUG nova.virt.libvirt.vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:32:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.787 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.787 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.788 186962 DEBUG os_vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.789 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.789 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.789 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.792 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c6a9dfe-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.793 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c6a9dfe-d9, col_values=(('external_ids', {'iface-id': '1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:ee:26', 'vm-uuid': '3f269630-23a3-4378-bca8-2177bcee52e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.795 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 NetworkManager[55134]: <info>  [1764401592.7971] manager: (tap1c6a9dfe-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.801 186962 INFO os_vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9')#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.802 186962 DEBUG nova.virt.libvirt.vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:32:48Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.802 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.803 186962 DEBUG nova.network.os_vif_util [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.804 186962 DEBUG os_vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.804 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.804 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.805 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.807 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.807 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd916a91b-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.808 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd916a91b-98, col_values=(('external_ids', {'iface-id': 'd916a91b-985b-452b-a2c4-b166a0be47b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:06:4a', 'vm-uuid': '3f269630-23a3-4378-bca8-2177bcee52e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.809 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 NetworkManager[55134]: <info>  [1764401592.8108] manager: (tapd916a91b-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.812 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.815 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:12 np0005539505 nova_compute[186958]: 2025-11-29 07:33:12.816 186962 INFO os_vif [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98')#033[00m
Nov 29 02:33:13 np0005539505 nova_compute[186958]: 2025-11-29 07:33:13.406 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:33:13 np0005539505 nova_compute[186958]: 2025-11-29 07:33:13.406 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:33:13 np0005539505 nova_compute[186958]: 2025-11-29 07:33:13.406 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:4a:ee:26, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:33:13 np0005539505 nova_compute[186958]: 2025-11-29 07:33:13.407 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:31:06:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:33:13 np0005539505 nova_compute[186958]: 2025-11-29 07:33:13.407 186962 INFO nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Using config drive#033[00m
Nov 29 02:33:13 np0005539505 podman[244142]: 2025-11-29 07:33:13.721142335 +0000 UTC m=+0.056371450 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:33:13 np0005539505 podman[244143]: 2025-11-29 07:33:13.757710685 +0000 UTC m=+0.089227205 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.290 186962 INFO nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Creating config drive at /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.config#033[00m
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.295 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0v1tw8n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.430 186962 DEBUG oslo_concurrency.processutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr0v1tw8n" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.5075] manager: (tap1c6a9dfe-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/343)
Nov 29 02:33:14 np0005539505 kernel: tap1c6a9dfe-d9: entered promiscuous mode
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00698|binding|INFO|Claiming lport 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 for this chassis.
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00699|binding|INFO|1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28: Claiming fa:16:3e:4a:ee:26 10.100.0.7
Nov 29 02:33:14 np0005539505 kernel: tapd916a91b-98: entered promiscuous mode
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.5724] manager: (tapd916a91b-98): new Tun device (/org/freedesktop/NetworkManager/Devices/344)
Nov 29 02:33:14 np0005539505 systemd-udevd[244216]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:33:14 np0005539505 systemd-udevd[244215]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.592 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:ee:26 10.100.0.7'], port_security=['fa:16:3e:4a:ee:26 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0790a775-3668-4bb8-97f1-d4276df58523, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.593 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 in datapath ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 bound to our chassis#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.594 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee42f4e1-7038-4a24-8d9b-8ee99ca415d0#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.6065] device (tap1c6a9dfe-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.6090] device (tap1c6a9dfe-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.607 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0e03f7-42c1-4954-993d-7c1e5171783c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.608 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee42f4e1-71 in ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.6109] device (tapd916a91b-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.610 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee42f4e1-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.610 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22dd1540-c1f6-40e3-a318-51c70752e9b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.6129] device (tapd916a91b-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.612 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a3923404-7e10-456e-b2c9-3bb20488ce0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.625 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae92082-3a78-40b5-91dc-b627a1f93b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 systemd-machined[153285]: New machine qemu-77-instance-00000093.
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00700|binding|INFO|Claiming lport d916a91b-985b-452b-a2c4-b166a0be47b3 for this chassis.
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00701|binding|INFO|d916a91b-985b-452b-a2c4-b166a0be47b3: Claiming fa:16:3e:31:06:4a 2001:db8:0:1:f816:3eff:fe31:64a 2001:db8::f816:3eff:fe31:64a
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.640 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.643 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00702|binding|INFO|Setting lport 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 ovn-installed in OVS
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.650 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 systemd[1]: Started Virtual Machine qemu-77-instance-00000093.
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.653 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:06:4a 2001:db8:0:1:f816:3eff:fe31:64a 2001:db8::f816:3eff:fe31:64a'], port_security=['fa:16:3e:31:06:4a 2001:db8:0:1:f816:3eff:fe31:64a 2001:db8::f816:3eff:fe31:64a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe31:64a/64 2001:db8::f816:3eff:fe31:64a/64', 'neutron:device_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d916a91b-985b-452b-a2c4-b166a0be47b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00703|binding|INFO|Setting lport 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 up in Southbound
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00704|binding|INFO|Setting lport d916a91b-985b-452b-a2c4-b166a0be47b3 ovn-installed in OVS
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00705|binding|INFO|Setting lport d916a91b-985b-452b-a2c4-b166a0be47b3 up in Southbound
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.660 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7c771f-18c1-4062-a739-c1d207487184]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.700 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7999a8c3-1aab-439f-8092-b531f3e4bb14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.706 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[07e34914-7bbd-414d-a758-f413edbdfd02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.7077] manager: (tapee42f4e1-70): new Veth device (/org/freedesktop/NetworkManager/Devices/345)
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.743 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6b495a30-6128-4b00-84a5-2203648480be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.747 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[94ba25e4-1247-44b4-93b8-9a132106056d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.7715] device (tapee42f4e1-70): carrier: link connected
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.777 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[72a56606-d15b-405a-ada3-f398b3cd61c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.797 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c744dcfd-49d7-41aa-83b0-1eb74bfb7ec4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee42f4e1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:a6:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704235, 'reachable_time': 40780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244252, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.816 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[01439d2c-5a0d-4b95-8c97-3f319704f4fe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:a67e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 704235, 'tstamp': 704235}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244253, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.833 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[16e55877-ee11-4dbd-a75d-b80a5f1b4c2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee42f4e1-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:a6:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704235, 'reachable_time': 40780, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244254, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.863 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f533793-7bf7-4231-a32f-dfb46e83e12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.916 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6317eec0-fea1-412c-bba7-76e2f6467a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.918 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee42f4e1-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.918 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.919 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee42f4e1-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:14 np0005539505 NetworkManager[55134]: <info>  [1764401594.9211] manager: (tapee42f4e1-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/346)
Nov 29 02:33:14 np0005539505 kernel: tapee42f4e1-70: entered promiscuous mode
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.930 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee42f4e1-70, col_values=(('external_ids', {'iface-id': '78e8cb8e-6743-4ef2-8e7c-19feddf2ed97'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:14Z|00706|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.931 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.934 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:33:14 np0005539505 nova_compute[186958]: 2025-11-29 07:33:14.944 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.943 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed0ab5a-e24a-4391-a87f-3d00eacd583f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.945 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.pid.haproxy
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID ee42f4e1-7038-4a24-8d9b-8ee99ca415d0
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:33:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:14.946 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'env', 'PROCESS_TAG=haproxy-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee42f4e1-7038-4a24-8d9b-8ee99ca415d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.259 186962 DEBUG nova.compute.manager [req-c19cba63-9f3d-4874-877c-a46c6c6adcc6 req-9018bbb8-b293-40e1-b7fe-193cc17619f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.259 186962 DEBUG oslo_concurrency.lockutils [req-c19cba63-9f3d-4874-877c-a46c6c6adcc6 req-9018bbb8-b293-40e1-b7fe-193cc17619f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.259 186962 DEBUG oslo_concurrency.lockutils [req-c19cba63-9f3d-4874-877c-a46c6c6adcc6 req-9018bbb8-b293-40e1-b7fe-193cc17619f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.260 186962 DEBUG oslo_concurrency.lockutils [req-c19cba63-9f3d-4874-877c-a46c6c6adcc6 req-9018bbb8-b293-40e1-b7fe-193cc17619f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.260 186962 DEBUG nova.compute.manager [req-c19cba63-9f3d-4874-877c-a46c6c6adcc6 req-9018bbb8-b293-40e1-b7fe-193cc17619f5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Processing event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:33:15 np0005539505 podman[244287]: 2025-11-29 07:33:15.29208487 +0000 UTC m=+0.051888954 container create 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:33:15 np0005539505 systemd[1]: Started libpod-conmon-5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954.scope.
Nov 29 02:33:15 np0005539505 podman[244287]: 2025-11-29 07:33:15.262674371 +0000 UTC m=+0.022478455 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:33:15 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:33:15 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcb08d7b4c0df3991bb94c0c038669b1c0fb81315879f7dd53c8d69b63f44a0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:33:15 np0005539505 podman[244287]: 2025-11-29 07:33:15.392625153 +0000 UTC m=+0.152429267 container init 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:33:15 np0005539505 podman[244287]: 2025-11-29 07:33:15.399007352 +0000 UTC m=+0.158811436 container start 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:33:15 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [NOTICE]   (244307) : New worker (244309) forked
Nov 29 02:33:15 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [NOTICE]   (244307) : Loading success.
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.452 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d916a91b-985b-452b-a2c4-b166a0be47b3 in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 unbound from our chassis#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.454 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 716ed53e-cc56-4286-b418-2f5e02d33124#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.464 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[650e1e3f-4d70-4c16-a557-989756d624e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.465 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap716ed53e-c1 in ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.467 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap716ed53e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.467 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ce999c1f-8d87-42d2-a9a9-177ad1d20831]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.468 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[71300568-d3ad-46b3-984a-9981fa6385e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.481 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8b81ec2b-60d6-4f92-9bd0-121fd09ef18b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.494 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f543571e-67bf-4034-8131-eced861cff9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.516 186962 DEBUG nova.network.neutron [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated VIF entry in instance network info cache for port d916a91b-985b-452b-a2c4-b166a0be47b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.517 186962 DEBUG nova.network.neutron [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.523 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a9bfeb39-e0fe-41fd-b33f-5f738ba92ab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 NetworkManager[55134]: <info>  [1764401595.5298] manager: (tap716ed53e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/347)
Nov 29 02:33:15 np0005539505 systemd-udevd[244236]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.531 186962 DEBUG oslo_concurrency.lockutils [req-c56a1666-a795-4aac-91e7-f782354b7502 req-e391b353-50d3-4f7f-8727-6814d9d62af5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.528 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[786a4890-1381-4156-b6bf-9373b63f103d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.560 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec632eb-9491-44b7-a7da-b5ec60b7ac07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.564 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[feab7831-bc5a-4ce1-9956-b3a8f29e6eb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 NetworkManager[55134]: <info>  [1764401595.5852] device (tap716ed53e-c0): carrier: link connected
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.588 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e366ec95-80eb-4847-b5d4-d7be751a91f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.607 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[666ada5a-62be-41a6-9746-767ae0a2beb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap716ed53e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:f1:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704316, 'reachable_time': 31339, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244328, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.622 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[52d3a6a3-d2b3-4c84-b9de-c1d790c71e8c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:f1bd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 704316, 'tstamp': 704316}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244329, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.643 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[93d81303-2af1-48d1-87b3-189a915040ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap716ed53e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:f1:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 223], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704316, 'reachable_time': 31339, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244330, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.675 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd3848a-7ad7-4db1-b0c1-0aa709c3b510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.708 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0046c89f-199a-4059-9471-75206e41c3a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.709 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap716ed53e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.710 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.710 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap716ed53e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:15 np0005539505 NetworkManager[55134]: <info>  [1764401595.7482] manager: (tap716ed53e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/348)
Nov 29 02:33:15 np0005539505 kernel: tap716ed53e-c0: entered promiscuous mode
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.747 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.751 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.755 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap716ed53e-c0, col_values=(('external_ids', {'iface-id': 'b0b0536c-6e35-42c5-8936-a1236a4f216e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.757 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:15Z|00707|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.760 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.761 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5fd48c-4c0a-4bea-ad81-e161e5043e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.761 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-716ed53e-cc56-4286-b418-2f5e02d33124
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/716ed53e-cc56-4286-b418-2f5e02d33124.pid.haproxy
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 716ed53e-cc56-4286-b418-2f5e02d33124
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:33:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:15.762 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'env', 'PROCESS_TAG=haproxy-716ed53e-cc56-4286-b418-2f5e02d33124', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/716ed53e-cc56-4286-b418-2f5e02d33124.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.784 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.835 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401595.8344455, 3f269630-23a3-4378-bca8-2177bcee52e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.835 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] VM Started (Lifecycle Event)#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.873 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.877 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401595.8377597, 3f269630-23a3-4378-bca8-2177bcee52e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.877 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.895 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.899 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:33:15 np0005539505 nova_compute[186958]: 2025-11-29 07:33:15.915 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:33:16 np0005539505 podman[244368]: 2025-11-29 07:33:16.111457407 +0000 UTC m=+0.028822503 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:33:16 np0005539505 podman[244368]: 2025-11-29 07:33:16.28724196 +0000 UTC m=+0.204607036 container create 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:33:16 np0005539505 systemd[1]: Started libpod-conmon-727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6.scope.
Nov 29 02:33:16 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:33:16 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d60a516de93f656b45cb56f081b477b380b6ddd8250887ef8bbd07a61b1c5089/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:33:16 np0005539505 podman[244368]: 2025-11-29 07:33:16.356252715 +0000 UTC m=+0.273617811 container init 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:33:16 np0005539505 podman[244368]: 2025-11-29 07:33:16.362425029 +0000 UTC m=+0.279790105 container start 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:33:16 np0005539505 nova_compute[186958]: 2025-11-29 07:33:16.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:16 np0005539505 podman[244382]: 2025-11-29 07:33:16.385420857 +0000 UTC m=+0.061627788 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:33:16 np0005539505 podman[244385]: 2025-11-29 07:33:16.385439648 +0000 UTC m=+0.062031760 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, 
config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:33:16 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [NOTICE]   (244420) : New worker (244428) forked
Nov 29 02:33:16 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [NOTICE]   (244420) : Loading success.
Nov 29 02:33:17 np0005539505 nova_compute[186958]: 2025-11-29 07:33:17.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.488 186962 DEBUG nova.compute.manager [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.488 186962 DEBUG oslo_concurrency.lockutils [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.488 186962 DEBUG oslo_concurrency.lockutils [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.488 186962 DEBUG oslo_concurrency.lockutils [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.489 186962 DEBUG nova.compute.manager [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No event matching network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 in dict_keys([('network-vif-plugged', '1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Nov 29 02:33:18 np0005539505 nova_compute[186958]: 2025-11-29 07:33:18.489 186962 WARNING nova.compute.manager [req-d1490084-e999-421f-97a4-253b403ca2da req-581b9ca9-64ac-4024-8885-54962d101975 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received unexpected event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:33:19 np0005539505 nova_compute[186958]: 2025-11-29 07:33:19.470 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:19 np0005539505 nova_compute[186958]: 2025-11-29 07:33:19.470 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:21 np0005539505 nova_compute[186958]: 2025-11-29 07:33:21.410 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.544 186962 DEBUG nova.compute.manager [req-1efd371e-17d7-4f02-b13b-d8e29b0df950 req-ae80f7a8-70cb-47ec-ba45-c5df6f8a702b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.544 186962 DEBUG oslo_concurrency.lockutils [req-1efd371e-17d7-4f02-b13b-d8e29b0df950 req-ae80f7a8-70cb-47ec-ba45-c5df6f8a702b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.545 186962 DEBUG oslo_concurrency.lockutils [req-1efd371e-17d7-4f02-b13b-d8e29b0df950 req-ae80f7a8-70cb-47ec-ba45-c5df6f8a702b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.545 186962 DEBUG oslo_concurrency.lockutils [req-1efd371e-17d7-4f02-b13b-d8e29b0df950 req-ae80f7a8-70cb-47ec-ba45-c5df6f8a702b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.545 186962 DEBUG nova.compute.manager [req-1efd371e-17d7-4f02-b13b-d8e29b0df950 req-ae80f7a8-70cb-47ec-ba45-c5df6f8a702b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Processing event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.545 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance event wait completed in 6 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.549 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401602.5492525, 3f269630-23a3-4378-bca8-2177bcee52e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.549 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.551 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.554 186962 INFO nova.virt.libvirt.driver [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance spawned successfully.#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.554 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.576 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.583 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.587 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.587 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.588 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.588 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.589 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.589 186962 DEBUG nova.virt.libvirt.driver [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.638 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.708 186962 INFO nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Took 26.67 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.709 186962 DEBUG nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.803 186962 INFO nova.compute.manager [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Took 38.75 seconds to build instance.#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.813 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.823 186962 DEBUG oslo_concurrency.lockutils [None req-f972005e-cf3b-4ea9-8530-efa4884a338d 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 38.878s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.824 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 26.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.824 186962 INFO nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] During sync_power_state the instance has a pending task (block_device_mapping). Skip.#033[00m
Nov 29 02:33:22 np0005539505 nova_compute[186958]: 2025-11-29 07:33:22.825 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:23 np0005539505 nova_compute[186958]: 2025-11-29 07:33:23.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:23 np0005539505 nova_compute[186958]: 2025-11-29 07:33:23.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.647 186962 DEBUG nova.compute.manager [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.647 186962 DEBUG oslo_concurrency.lockutils [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.647 186962 DEBUG oslo_concurrency.lockutils [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.648 186962 DEBUG oslo_concurrency.lockutils [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.648 186962 DEBUG nova.compute.manager [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No waiting events found dispatching network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:33:24 np0005539505 nova_compute[186958]: 2025-11-29 07:33:24.648 186962 WARNING nova.compute.manager [req-eb65d63f-6065-41c2-b940-9dee9060915b req-d2b4603f-d6cd-4e73-a9c7-6b623b95120e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received unexpected event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:33:26 np0005539505 nova_compute[186958]: 2025-11-29 07:33:26.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:26 np0005539505 nova_compute[186958]: 2025-11-29 07:33:26.449 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:27.513 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:27.514 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:27.515 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:27 np0005539505 nova_compute[186958]: 2025-11-29 07:33:27.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:29 np0005539505 nova_compute[186958]: 2025-11-29 07:33:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:30 np0005539505 NetworkManager[55134]: <info>  [1764401610.2385] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/349)
Nov 29 02:33:30 np0005539505 NetworkManager[55134]: <info>  [1764401610.2405] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.407 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:30Z|00708|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:33:30 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:30Z|00709|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.595 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.596 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.596 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.597 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.643 186962 DEBUG nova.compute.manager [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.643 186962 DEBUG nova.compute.manager [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing instance network info cache due to event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:33:30 np0005539505 nova_compute[186958]: 2025-11-29 07:33:30.644 186962 DEBUG oslo_concurrency.lockutils [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:33:31 np0005539505 nova_compute[186958]: 2025-11-29 07:33:31.142 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:31 np0005539505 nova_compute[186958]: 2025-11-29 07:33:31.452 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:31 np0005539505 nova_compute[186958]: 2025-11-29 07:33:31.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:31 np0005539505 podman[244440]: 2025-11-29 07:33:31.721140895 +0000 UTC m=+0.050363650 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:33:31 np0005539505 podman[244439]: 2025-11-29 07:33:31.730009965 +0000 UTC m=+0.061116373 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 02:33:32 np0005539505 nova_compute[186958]: 2025-11-29 07:33:32.821 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:34 np0005539505 podman[244499]: 2025-11-29 07:33:34.716258919 +0000 UTC m=+0.048652002 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:33:35 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:35Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:ee:26 10.100.0.7
Nov 29 02:33:35 np0005539505 ovn_controller[95143]: 2025-11-29T07:33:35Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:ee:26 10.100.0.7
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.159 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.181 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.181 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.182 186962 DEBUG oslo_concurrency.lockutils [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.182 186962 DEBUG nova.network.neutron [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing network info cache for port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.183 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.225 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.226 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.226 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.226 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.317 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.384 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.385 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.446 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.454 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.606 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.608 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5502MB free_disk=73.04493713378906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.608 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.609 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.740 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 3f269630-23a3-4378-bca8-2177bcee52e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.741 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:33:36 np0005539505 nova_compute[186958]: 2025-11-29 07:33:36.741 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.035 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.069 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.117 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.118 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.509s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.747 186962 DEBUG nova.network.neutron [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated VIF entry in instance network info cache for port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.748 186962 DEBUG nova.network.neutron [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.763 186962 DEBUG oslo_concurrency.lockutils [req-9e0cb3bc-b45b-4fd2-859c-bf2b97c39e83 req-e0dea97d-9729-4cb3-8ee8-474ee8e88aa8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:33:37 np0005539505 nova_compute[186958]: 2025-11-29 07:33:37.823 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:38 np0005539505 nova_compute[186958]: 2025-11-29 07:33:38.112 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:38 np0005539505 nova_compute[186958]: 2025-11-29 07:33:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:33:38 np0005539505 nova_compute[186958]: 2025-11-29 07:33:38.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:38 np0005539505 nova_compute[186958]: 2025-11-29 07:33:38.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:39.911 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:33:39 np0005539505 nova_compute[186958]: 2025-11-29 07:33:39.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:39.913 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:33:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:33:40.917 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:33:41 np0005539505 nova_compute[186958]: 2025-11-29 07:33:41.510 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:42 np0005539505 nova_compute[186958]: 2025-11-29 07:33:42.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:42 np0005539505 nova_compute[186958]: 2025-11-29 07:33:42.862 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:43 np0005539505 nova_compute[186958]: 2025-11-29 07:33:43.143 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:44 np0005539505 podman[244525]: 2025-11-29 07:33:44.716979601 +0000 UTC m=+0.046937533 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:33:44 np0005539505 podman[244526]: 2025-11-29 07:33:44.778626678 +0000 UTC m=+0.104956268 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 29 02:33:46 np0005539505 nova_compute[186958]: 2025-11-29 07:33:46.521 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:46 np0005539505 podman[244572]: 2025-11-29 07:33:46.721476932 +0000 UTC m=+0.048670792 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:33:46 np0005539505 podman[244571]: 2025-11-29 07:33:46.730944939 +0000 UTC m=+0.062040289 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:33:47 np0005539505 nova_compute[186958]: 2025-11-29 07:33:47.891 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:51 np0005539505 nova_compute[186958]: 2025-11-29 07:33:51.524 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:52 np0005539505 nova_compute[186958]: 2025-11-29 07:33:52.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:56 np0005539505 nova_compute[186958]: 2025-11-29 07:33:56.527 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:33:57 np0005539505 nova_compute[186958]: 2025-11-29 07:33:57.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:00Z|00710|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:34:00 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:00Z|00711|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:34:00 np0005539505 nova_compute[186958]: 2025-11-29 07:34:00.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:01 np0005539505 nova_compute[186958]: 2025-11-29 07:34:01.529 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:02 np0005539505 podman[244611]: 2025-11-29 07:34:02.747151073 +0000 UTC m=+0.077328540 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Nov 29 02:34:02 np0005539505 podman[244612]: 2025-11-29 07:34:02.771398706 +0000 UTC m=+0.085476559 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:34:02 np0005539505 nova_compute[186958]: 2025-11-29 07:34:02.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:05 np0005539505 podman[244657]: 2025-11-29 07:34:05.720406742 +0000 UTC m=+0.051608235 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 29 02:34:06 np0005539505 nova_compute[186958]: 2025-11-29 07:34:06.531 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:07 np0005539505 nova_compute[186958]: 2025-11-29 07:34:07.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:09Z|00712|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:34:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:09Z|00713|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:34:09 np0005539505 nova_compute[186958]: 2025-11-29 07:34:09.307 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:11 np0005539505 nova_compute[186958]: 2025-11-29 07:34:11.532 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:12 np0005539505 nova_compute[186958]: 2025-11-29 07:34:12.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:15 np0005539505 podman[244676]: 2025-11-29 07:34:15.745677883 +0000 UTC m=+0.066789943 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:34:15 np0005539505 podman[244677]: 2025-11-29 07:34:15.764117422 +0000 UTC m=+0.086586800 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 02:34:16 np0005539505 nova_compute[186958]: 2025-11-29 07:34:16.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:17 np0005539505 nova_compute[186958]: 2025-11-29 07:34:17.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:17 np0005539505 podman[244726]: 2025-11-29 07:34:17.726043883 +0000 UTC m=+0.053234111 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:34:17 np0005539505 podman[244727]: 2025-11-29 07:34:17.74685541 +0000 UTC m=+0.065239939 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:34:17 np0005539505 nova_compute[186958]: 2025-11-29 07:34:17.919 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:20.225 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:34:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:20.226 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:34:20 np0005539505 nova_compute[186958]: 2025-11-29 07:34:20.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:20 np0005539505 nova_compute[186958]: 2025-11-29 07:34:20.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:21 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:21.229 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:34:21 np0005539505 nova_compute[186958]: 2025-11-29 07:34:21.545 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:22 np0005539505 nova_compute[186958]: 2025-11-29 07:34:22.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:22 np0005539505 nova_compute[186958]: 2025-11-29 07:34:22.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:25 np0005539505 nova_compute[186958]: 2025-11-29 07:34:25.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:25 np0005539505 nova_compute[186958]: 2025-11-29 07:34:25.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:34:26 np0005539505 nova_compute[186958]: 2025-11-29 07:34:26.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:27.514 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:27.514 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:34:27.515 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:34:27 np0005539505 nova_compute[186958]: 2025-11-29 07:34:27.924 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:28 np0005539505 nova_compute[186958]: 2025-11-29 07:34:28.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:30 np0005539505 nova_compute[186958]: 2025-11-29 07:34:30.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:30 np0005539505 nova_compute[186958]: 2025-11-29 07:34:30.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:34:30 np0005539505 nova_compute[186958]: 2025-11-29 07:34:30.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:34:31 np0005539505 nova_compute[186958]: 2025-11-29 07:34:31.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:32 np0005539505 nova_compute[186958]: 2025-11-29 07:34:32.926 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:33 np0005539505 podman[244770]: 2025-11-29 07:34:33.766519843 +0000 UTC m=+0.080002075 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:34:33 np0005539505 podman[244769]: 2025-11-29 07:34:33.773929482 +0000 UTC m=+0.092184599 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 29 02:34:36 np0005539505 nova_compute[186958]: 2025-11-29 07:34:36.550 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:36 np0005539505 podman[244814]: 2025-11-29 07:34:36.743411043 +0000 UTC m=+0.068432709 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:34:37 np0005539505 nova_compute[186958]: 2025-11-29 07:34:37.929 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:41 np0005539505 nova_compute[186958]: 2025-11-29 07:34:41.092 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.19 sec#033[00m
Nov 29 02:34:41 np0005539505 nova_compute[186958]: 2025-11-29 07:34:41.552 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:42 np0005539505 nova_compute[186958]: 2025-11-29 07:34:42.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:44 np0005539505 nova_compute[186958]: 2025-11-29 07:34:44.584 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:34:44 np0005539505 nova_compute[186958]: 2025-11-29 07:34:44.585 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:34:44 np0005539505 nova_compute[186958]: 2025-11-29 07:34:44.585 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:34:44 np0005539505 nova_compute[186958]: 2025-11-29 07:34:44.586 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:34:46 np0005539505 nova_compute[186958]: 2025-11-29 07:34:46.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:46 np0005539505 podman[244832]: 2025-11-29 07:34:46.715006123 +0000 UTC m=+0.049870996 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:34:46 np0005539505 podman[244833]: 2025-11-29 07:34:46.791313534 +0000 UTC m=+0.116692630 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:34:47 np0005539505 nova_compute[186958]: 2025-11-29 07:34:47.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.105 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'name': 'tempest-TestGettingAddress-server-1508463221', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000093', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.124 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.125 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a9eea43-cc7a-47cf-a903-1f68e59f1430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.107053', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e21f8456-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': '71dd211c50829c035074ba018316540d959b2ea7f54e38b470b513e6b4790dc1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.107053', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e21f9d4c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': '0fb82a9ccfa943eeddd8bd6dd661d8ef443eb329e4a32a343fb0746f083a2b46'}]}, 'timestamp': '2025-11-29 07:34:48.126387', '_unique_id': 'e2fcb080d7ee49c8aa4321939c1b6a2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.133 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3f269630-23a3-4378-bca8-2177bcee52e5 / tap1c6a9dfe-d9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.134 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3f269630-23a3-4378-bca8-2177bcee52e5 / tapd916a91b-98 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.134 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.135 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b598a2f3-5f89-47d8-a5c5-0e4b80ca327a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.130181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e220ee18-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'a02e6ff0023b581d0856edfa85c9e7f43249bff0fac1d3df0b8af67ba85a75f9'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.130181', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e221004c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '4682c66a66236440bb2c9946091c526a3c1d1f3ec4464bac24fc88e8e1371cc9'}]}, 'timestamp': '2025-11-29 07:34:48.135449', '_unique_id': '00553ffdc4754eac8adb572428498552'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.136 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.138 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.138 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6ab32f7-a87d-44e2-b9a0-d8f9b1e4c0b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.138270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e2217f5e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'fb06e091324994f799e23f1118af1ebe17dce5893c07dd2f1c0d0e7d56a9e6a5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.138270', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e2218e2c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '856136855b9270cf9f2743214c46306e167762fc6d393b34586a34908588e28b'}]}, 'timestamp': '2025-11-29 07:34:48.139070', '_unique_id': '780c7d69b1444c18a75bf534bf859baa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.139 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.141 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.141 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fe3951a-110e-4790-8613-69d950ac23ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.141476', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e221fbf0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'b2463c5a4040d1a33ea1bf855f9ec5fa527028eb0262668932a0a57e00c5dd6d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.141476', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e2220abe-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '57f2149a806fc6cf2c95501ca9494dc042651ed5a7668f8a0db53f91782affb6'}]}, 'timestamp': '2025-11-29 07:34:48.142290', '_unique_id': 'e92d6c84a66248a39c47211d106f7f83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.143 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.181 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.latency volume: 3623554544 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.182 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e975ba8-d958-4040-968f-da77ad9599e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3623554544, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.144534', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e2282ed0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': 'ed49735f0192413a8ecab6366f312ed6e9865f10b43b2b7843fce8caf758be46'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.144534', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e2283e02-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '80917f6ff3f1657ffb7b10841248dae35f26ab57aaa178f771be163f2f9ade51'}]}, 'timestamp': '2025-11-29 07:34:48.182889', '_unique_id': 'cf8c137d97804f5aa335c08467f90a34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.185 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.bytes volume: 73060352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.186 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f9c94ef-a5cc-4cda-8734-87dc37dfbd9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73060352, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.185828', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e228bf12-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '158e44df80ced1e602a82adc366d8ed753d5ca1fb0438394d4483869174a70e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.185828', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e228ccc8-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '142593fe3c509da5033aef4d9e5bc04466817207aaae5e21fe0c55d72939f315'}]}, 'timestamp': '2025-11-29 07:34:48.186529', '_unique_id': '9d36d56744cf47769775621ccfb74bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.188 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.188 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>]
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.189 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.189 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7af7982b-0429-4f9f-a711-7091d5c59cdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.189084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e2293e9c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '06d12d56704ce539894452fbfe3e99256602e2b7b607aa9f5b3c5a0eb54b6a24'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.189084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e2294b76-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '7062e8710e6e547f711dc8bfc3f00b6debc8fc7a984fd7778918d67ea3a124de'}]}, 'timestamp': '2025-11-29 07:34:48.189813', '_unique_id': 'e31316340ae64ae8b0006d1fddd380e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.191 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.192 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb0eba46-0142-4d07-88e9-ea8bec2ec1df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.191752', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e229a616-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'af0ac56d1a45f6bf65437c09f2b87784e2812a3d1e663a0df13fd234221dab9a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.191752', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e229b37c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'f3f0014df21c9a5549bc7cdbdcef27e5d5d4887e23880cf20281a89c4f32f79e'}]}, 'timestamp': '2025-11-29 07:34:48.192437', '_unique_id': 'ca31cf808d12450b9f06b2fec40dd689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.194 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.195 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b06768fb-ce8e-4097-a613-5cc91f348ee7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.194809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e22a1d80-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '488611be3ecc029a932bcc6b3cec71c3d8044a5470d5876a0957ab1ce147fde0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.194809', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e22a2b22-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '34b586880ace430bc40fb35164765744d87947fbe86878b6060050c5a8ff73f3'}]}, 'timestamp': '2025-11-29 07:34:48.195501', '_unique_id': '0939c5b0be554b9694c3bff5c123b8be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.197 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.197 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>]
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.198 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.198 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>]
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.198 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.199 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.bytes volume: 3304 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2b4aafa-c831-4d46-90cd-360969232100', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.198615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e22ab93e-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'de89718ef930de95389589d530e5007fb85eb314237057024631789372c85c03'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3304, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.198615', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e22ac744-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '7a8f6457c7a46e5e2a9050b6fa53920dd15c7c93df03ff5aada39d48af694e4e'}]}, 'timestamp': '2025-11-29 07:34:48.199499', '_unique_id': '4ec535cab3a14c438cb0f107320d91b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.201 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.202 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7622198d-f612-4089-8196-f1ad1eba2936', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.201865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e22b3544-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '5bd6df22185b15c0d7f2c0a959a2fb28376d6707a8894ab550ba8de29e470eb8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.201865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e22b41a6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '18180683c9e6f99ec39d93cfdc1422e6e50846165ad06df59206b65a6983c0ec'}]}, 'timestamp': '2025-11-29 07:34:48.202606', '_unique_id': '74dee8b6fb6d43d29038e2433f9f9890'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.204 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.bytes volume: 30665216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.204 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27035504-791e-4ded-98f8-33d0770e7bc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30665216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.204313', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22b8da0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '77ff21086a49594b4aea755620cb2ba32923521687decadf468807dc0f62fdc7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.204313', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22b970a-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '10ddcaf117f3f8673b338424421b51884fc708b4f3658ab0112e20e0d297b91b'}]}, 'timestamp': '2025-11-29 07:34:48.204769', '_unique_id': '8bf0f44c9abc4680a42eedbb5787ab84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.205 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d593cc1-6697-433e-b1b9-2cd86709cb9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.205869', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e22bcbee-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '01be1654d359e99ac9da17f92387bbaef0d3f0840788fa13b58e5977706a0e86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.205869', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e22bd4fe-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '7c49ce5f0de926d107d76f65ec67478c7a0b235b9d5550ce73ddf08ae7a3c68c'}]}, 'timestamp': '2025-11-29 07:34:48.206364', '_unique_id': 'd6601410cbb64794a7cf026557fe6683'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1508463221>]
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.207 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff7bc0cb-721e-4836-b0d4-bed434f191b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.207759', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22c13f6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': 'e193996b2b20b449440a3e860ad78d469feaf89035015c55295000e43493f535'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.207759', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22c1ba8-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': '51384aa6f265ff7d667a0ab19de9f0af3db167701152a04538b18a44f89765c5'}]}, 'timestamp': '2025-11-29 07:34:48.208189', '_unique_id': '6568b6f93e254045b2f00c4c73d61c45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.209 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.requests volume: 1108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.209 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29c82c77-11fb-41f5-b064-07a414e10a27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.209354', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22c530c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '28267c8c2a55d2ffec14b664174cfb47d78669ef1d27049dbb6833f9040fb797'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': 
None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.209354', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22c5ff0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': 'e359a099ad6ca3879d6b30c385428ce6828831b605fd2849999d6b2d1bdfc737'}]}, 'timestamp': '2025-11-29 07:34:48.209969', '_unique_id': '76bec0c8e2d8444dbdc69a5eb892e080'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.211 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.latency volume: 226214529 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.211 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.read.latency volume: 21888362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9db1772f-a6e6-4a98-9a1c-bacbd05ce780', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 226214529, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.211113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22c9830-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '9220887007c10081247c9f2730fdf512b21eef8fe99edd58a8b4f212f775142d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21888362, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': 
None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.211113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22ca26c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '797ad462b2795b9939187d1bf65fb8caf006d57663e4c6b91d8b0df63468c260'}]}, 'timestamp': '2025-11-29 07:34:48.211616', '_unique_id': '1469868e3006473a84352119f3b364c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.212 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6273cf0-3995-4f9a-87de-bdc38097f87a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tap1c6a9dfe-d9', 'timestamp': '2025-11-29T07:34:48.212725', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tap1c6a9dfe-d9', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:ee:26', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c6a9dfe-d9'}, 'message_id': 'e22cd624-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': 'b9ca89be11f834c130df307ab5a4344651d1b0a4b38c099914c77ffc4a0ed8b9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000093-3f269630-23a3-4378-bca8-2177bcee52e5-tapd916a91b-98', 'timestamp': '2025-11-29T07:34:48.212725', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'tapd916a91b-98', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:31:06:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd916a91b-98'}, 'message_id': 'e22cded0-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.771084348, 'message_signature': '89bd1c4878c19bc77cd8b67516661e8b16309d2cae48d7063e71d236d21058cb'}]}, 'timestamp': '2025-11-29 07:34:48.213170', '_unique_id': '43dc1284901046a2bdfb05d7205700b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.214 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '502c1551-61ef-4811-b760-ad5b49d1d243', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.214732', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e22d2610-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': '4107e923181647d82e2e4d46c0c271adaf1c326540b20a6347a17f11ef2cc96c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.214732', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e22d30f6-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.747914165, 'message_signature': 'bbe466fd2989533e8cac55193c29e2b08771121337e5b20e580b019ef5234964'}]}, 'timestamp': '2025-11-29 07:34:48.215287', '_unique_id': '5ce19b6f1d7e441ea739d2200cc30c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.216 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.236 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/cpu volume: 12460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74808748-d624-4b81-99af-308487d3bfda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12460000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'timestamp': '2025-11-29T07:34:48.216746', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e230761c-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.877145627, 'message_signature': '705c7287375280dd270ebed57529e8a99df01119b5a37312949e0dca65d98dc0'}]}, 'timestamp': '2025-11-29 07:34:48.236744', '_unique_id': 'b69ba65d4e6a4150b500232e780c5735'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.237 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/memory.usage volume: 43.75 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c5efbda-9948-49f7-89f8-3164215ed89a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.75, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'timestamp': '2025-11-29T07:34:48.238182', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e230b992-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.877145627, 'message_signature': 'e9bb3a405f744c7fd07d0426e6138af977546b9d6a8eb3986e106d605b08f321'}]}, 'timestamp': '2025-11-29 07:34:48.238427', '_unique_id': '9e5ab7d9c81f4e40a242c0b64446be8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.238 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.239 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.requests volume: 320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.239 12 DEBUG ceilometer.compute.pollsters [-] 3f269630-23a3-4378-bca8-2177bcee52e5/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f28fd9fc-258a-423c-951b-dd40f97993de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 320, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-vda', 'timestamp': '2025-11-29T07:34:48.239539', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e230edb8-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': '8781d69e747a5c1b8c1da820f2081c4436f77f02e993d3fc4ca2015d1b83af31'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '3f269630-23a3-4378-bca8-2177bcee52e5-sda', 'timestamp': '2025-11-29T07:34:48.239539', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1508463221', 'name': 'instance-00000093', 'instance_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e230f6fa-ccf5-11f0-8954-fa163e5a5606', 'monotonic_time': 7135.785415542, 'message_signature': 'cc5418febdc55f18e2bf3fdd9342fdf38cbd02d97aca7076fbeaa95b0d668286'}]}, 'timestamp': '2025-11-29 07:34:48.239994', '_unique_id': '98b2e9fcc3474305937e78ae80a672ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:34:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:34:48.240 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:34:48 np0005539505 podman[244883]: 2025-11-29 07:34:48.739609282 +0000 UTC m=+0.068729858 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:34:48 np0005539505 podman[244884]: 2025-11-29 07:34:48.740331502 +0000 UTC m=+0.062755609 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:34:48 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:48Z|00714|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:34:48 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:48Z|00715|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:34:49 np0005539505 nova_compute[186958]: 2025-11-29 07:34:49.104 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:51 np0005539505 nova_compute[186958]: 2025-11-29 07:34:51.558 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:52 np0005539505 nova_compute[186958]: 2025-11-29 07:34:52.941 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:53Z|00716|binding|INFO|Releasing lport b0b0536c-6e35-42c5-8936-a1236a4f216e from this chassis (sb_readonly=0)
Nov 29 02:34:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:34:53Z|00717|binding|INFO|Releasing lport 78e8cb8e-6743-4ef2-8e7c-19feddf2ed97 from this chassis (sb_readonly=0)
Nov 29 02:34:53 np0005539505 nova_compute[186958]: 2025-11-29 07:34:53.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.279 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.313 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.313 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.314 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.424 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.424 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.424 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.425 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.592 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.661 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.663 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.720 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.883 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.884 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5552MB free_disk=73.04502487182617GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.885 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:34:54 np0005539505 nova_compute[186958]: 2025-11-29 07:34:54.885 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.489 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 3f269630-23a3-4378-bca8-2177bcee52e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.490 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.490 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.699 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.739 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.740 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.785 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.867 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:34:56 np0005539505 nova_compute[186958]: 2025-11-29 07:34:56.994 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:34:57 np0005539505 nova_compute[186958]: 2025-11-29 07:34:57.313 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:34:57 np0005539505 nova_compute[186958]: 2025-11-29 07:34:57.316 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:34:57 np0005539505 nova_compute[186958]: 2025-11-29 07:34:57.317 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:34:57 np0005539505 nova_compute[186958]: 2025-11-29 07:34:57.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:00 np0005539505 nova_compute[186958]: 2025-11-29 07:35:00.313 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:01 np0005539505 nova_compute[186958]: 2025-11-29 07:35:01.562 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:02 np0005539505 nova_compute[186958]: 2025-11-29 07:35:02.950 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:04 np0005539505 podman[244929]: 2025-11-29 07:35:04.71410548 +0000 UTC m=+0.044973418 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:35:04 np0005539505 podman[244928]: 2025-11-29 07:35:04.715958322 +0000 UTC m=+0.049460405 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 29 02:35:06 np0005539505 nova_compute[186958]: 2025-11-29 07:35:06.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:07 np0005539505 podman[244973]: 2025-11-29 07:35:07.709324557 +0000 UTC m=+0.047273874 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 29 02:35:07 np0005539505 nova_compute[186958]: 2025-11-29 07:35:07.952 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:11 np0005539505 nova_compute[186958]: 2025-11-29 07:35:11.620 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:12 np0005539505 nova_compute[186958]: 2025-11-29 07:35:12.956 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:16 np0005539505 nova_compute[186958]: 2025-11-29 07:35:16.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:17 np0005539505 podman[244993]: 2025-11-29 07:35:17.722566285 +0000 UTC m=+0.048908060 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:35:17 np0005539505 podman[244994]: 2025-11-29 07:35:17.759148175 +0000 UTC m=+0.084104081 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:35:17 np0005539505 nova_compute[186958]: 2025-11-29 07:35:17.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:18.175 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:18 np0005539505 nova_compute[186958]: 2025-11-29 07:35:18.176 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:18.176 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:35:19 np0005539505 nova_compute[186958]: 2025-11-29 07:35:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:19 np0005539505 podman[245040]: 2025-11-29 07:35:19.715039616 +0000 UTC m=+0.050146384 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:35:19 np0005539505 podman[245041]: 2025-11-29 07:35:19.716747704 +0000 UTC m=+0.048582230 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:35:21 np0005539505 nova_compute[186958]: 2025-11-29 07:35:21.656 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:22.178 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:22 np0005539505 nova_compute[186958]: 2025-11-29 07:35:22.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:22 np0005539505 nova_compute[186958]: 2025-11-29 07:35:22.960 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:25 np0005539505 nova_compute[186958]: 2025-11-29 07:35:25.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:25 np0005539505 nova_compute[186958]: 2025-11-29 07:35:25.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:35:26 np0005539505 nova_compute[186958]: 2025-11-29 07:35:26.660 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539505 nova_compute[186958]: 2025-11-29 07:35:27.448 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:27.515 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:27.516 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:27.517 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:27 np0005539505 nova_compute[186958]: 2025-11-29 07:35:27.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:30 np0005539505 nova_compute[186958]: 2025-11-29 07:35:30.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:31 np0005539505 nova_compute[186958]: 2025-11-29 07:35:31.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:31 np0005539505 nova_compute[186958]: 2025-11-29 07:35:31.663 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:32 np0005539505 nova_compute[186958]: 2025-11-29 07:35:32.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:32 np0005539505 nova_compute[186958]: 2025-11-29 07:35:32.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:35:32 np0005539505 nova_compute[186958]: 2025-11-29 07:35:32.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:35:32 np0005539505 nova_compute[186958]: 2025-11-29 07:35:32.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:33 np0005539505 nova_compute[186958]: 2025-11-29 07:35:33.091 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:33 np0005539505 nova_compute[186958]: 2025-11-29 07:35:33.091 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:33 np0005539505 nova_compute[186958]: 2025-11-29 07:35:33.091 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:35:33 np0005539505 nova_compute[186958]: 2025-11-29 07:35:33.092 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:34 np0005539505 nova_compute[186958]: 2025-11-29 07:35:34.588 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:35 np0005539505 podman[245082]: 2025-11-29 07:35:35.730122095 +0000 UTC m=+0.059957561 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:35:35 np0005539505 podman[245081]: 2025-11-29 07:35:35.735909868 +0000 UTC m=+0.065202258 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 29 02:35:36 np0005539505 nova_compute[186958]: 2025-11-29 07:35:36.757 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.016 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.156 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.179 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.179 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.180 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.207 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.208 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.208 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:38 np0005539505 nova_compute[186958]: 2025-11-29 07:35:38.208 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:35:38 np0005539505 podman[245126]: 2025-11-29 07:35:38.735379995 +0000 UTC m=+0.066260409 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:35:39 np0005539505 nova_compute[186958]: 2025-11-29 07:35:39.842 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:39 np0005539505 nova_compute[186958]: 2025-11-29 07:35:39.905 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:39 np0005539505 nova_compute[186958]: 2025-11-29 07:35:39.906 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:35:39 np0005539505 nova_compute[186958]: 2025-11-29 07:35:39.957 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.146 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.148 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5542MB free_disk=73.04546356201172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.148 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.149 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.235 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 3f269630-23a3-4378-bca8-2177bcee52e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.236 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.236 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.308 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.331 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.332 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:35:40 np0005539505 nova_compute[186958]: 2025-11-29 07:35:40.333 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:41 np0005539505 nova_compute[186958]: 2025-11-29 07:35:41.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:42 np0005539505 nova_compute[186958]: 2025-11-29 07:35:42.532 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:42 np0005539505 nova_compute[186958]: 2025-11-29 07:35:42.534 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:35:43 np0005539505 nova_compute[186958]: 2025-11-29 07:35:43.019 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:44 np0005539505 nova_compute[186958]: 2025-11-29 07:35:44.198 186962 DEBUG nova.compute.manager [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:44 np0005539505 nova_compute[186958]: 2025-11-29 07:35:44.199 186962 DEBUG nova.compute.manager [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing instance network info cache due to event network-changed-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:35:44 np0005539505 nova_compute[186958]: 2025-11-29 07:35:44.199 186962 DEBUG oslo_concurrency.lockutils [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:35:44 np0005539505 nova_compute[186958]: 2025-11-29 07:35:44.199 186962 DEBUG oslo_concurrency.lockutils [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:35:44 np0005539505 nova_compute[186958]: 2025-11-29 07:35:44.200 186962 DEBUG nova.network.neutron [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Refreshing network info cache for port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:35:45 np0005539505 nova_compute[186958]: 2025-11-29 07:35:45.980 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:45 np0005539505 nova_compute[186958]: 2025-11-29 07:35:45.981 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:45 np0005539505 nova_compute[186958]: 2025-11-29 07:35:45.981 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:45 np0005539505 nova_compute[186958]: 2025-11-29 07:35:45.982 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:45 np0005539505 nova_compute[186958]: 2025-11-29 07:35:45.982 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.000 186962 INFO nova.compute.manager [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Terminating instance#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.015 186962 DEBUG nova.compute.manager [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:35:46 np0005539505 kernel: tap1c6a9dfe-d9 (unregistering): left promiscuous mode
Nov 29 02:35:46 np0005539505 NetworkManager[55134]: <info>  [1764401746.0494] device (tap1c6a9dfe-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.101 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00718|binding|INFO|Releasing lport 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 from this chassis (sb_readonly=0)
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00719|binding|INFO|Setting lport 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 down in Southbound
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00720|binding|INFO|Removing iface tap1c6a9dfe-d9 ovn-installed in OVS
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.105 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.113 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:ee:26 10.100.0.7'], port_security=['fa:16:3e:4a:ee:26 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0790a775-3668-4bb8-97f1-d4276df58523, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.115 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 in datapath ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 unbound from our chassis#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.119 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.120 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb57e88a-6700-49a7-8a8d-3668f8cf2976]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.121 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 namespace which is not needed anymore#033[00m
Nov 29 02:35:46 np0005539505 kernel: tapd916a91b-98 (unregistering): left promiscuous mode
Nov 29 02:35:46 np0005539505 NetworkManager[55134]: <info>  [1764401746.1323] device (tapd916a91b-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.136 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00721|binding|INFO|Releasing lport d916a91b-985b-452b-a2c4-b166a0be47b3 from this chassis (sb_readonly=0)
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00722|binding|INFO|Setting lport d916a91b-985b-452b-a2c4-b166a0be47b3 down in Southbound
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.147 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 ovn_controller[95143]: 2025-11-29T07:35:46Z|00723|binding|INFO|Removing iface tapd916a91b-98 ovn-installed in OVS
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.151 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:46.157 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:06:4a 2001:db8:0:1:f816:3eff:fe31:64a 2001:db8::f816:3eff:fe31:64a'], port_security=['fa:16:3e:31:06:4a 2001:db8:0:1:f816:3eff:fe31:64a 2001:db8::f816:3eff:fe31:64a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe31:64a/64 2001:db8::f816:3eff:fe31:64a/64', 'neutron:device_id': '3f269630-23a3-4378-bca8-2177bcee52e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-716ed53e-cc56-4286-b418-2f5e02d33124', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7900eada-6f98-452f-b178-ad26f2b82064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c069d1db-d7e5-4641-988e-cd6e75103caa, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d916a91b-985b-452b-a2c4-b166a0be47b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.178 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000093.scope: Deactivated successfully.
Nov 29 02:35:46 np0005539505 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d00000093.scope: Consumed 19.889s CPU time.
Nov 29 02:35:46 np0005539505 systemd-machined[153285]: Machine qemu-77-instance-00000093 terminated.
Nov 29 02:35:46 np0005539505 NetworkManager[55134]: <info>  [1764401746.2361] manager: (tap1c6a9dfe-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/351)
Nov 29 02:35:46 np0005539505 NetworkManager[55134]: <info>  [1764401746.2495] manager: (tapd916a91b-98): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.293 186962 INFO nova.virt.libvirt.driver [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Instance destroyed successfully.#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.294 186962 DEBUG nova.objects.instance [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 3f269630-23a3-4378-bca8-2177bcee52e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.311 186962 DEBUG nova.virt.libvirt.vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:22Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.312 186962 DEBUG nova.network.os_vif_util [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.313 186962 DEBUG nova.network.os_vif_util [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.314 186962 DEBUG os_vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.316 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.316 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c6a9dfe-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.320 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.324 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.327 186962 INFO os_vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:ee:26,bridge_name='br-int',has_traffic_filtering=True,id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28,network=Network(ee42f4e1-7038-4a24-8d9b-8ee99ca415d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c6a9dfe-d9')#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.328 186962 DEBUG nova.virt.libvirt.vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:32:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1508463221',display_name='tempest-TestGettingAddress-server-1508463221',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1508463221',id=147,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBK18BM+cTj/3sUxBzPfGqMZVFW8u0MAYl6D47npYBAuFCVOWerNtWreCBLJmXVjolkwHlCMhwEYpOxlO5EeqIXZi9GpQKv0jmJXL6Uw9pSzb3x5DyZSKVQsbQBVQ+UXj+w==',key_name='tempest-TestGettingAddress-668700412',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:33:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-j3ghn6ga',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:33:22Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=3f269630-23a3-4378-bca8-2177bcee52e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.328 186962 DEBUG nova.network.os_vif_util [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.329 186962 DEBUG nova.network.os_vif_util [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.330 186962 DEBUG os_vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.331 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.331 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd916a91b-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.333 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.335 186962 INFO os_vif [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:06:4a,bridge_name='br-int',has_traffic_filtering=True,id=d916a91b-985b-452b-a2c4-b166a0be47b3,network=Network(716ed53e-cc56-4286-b418-2f5e02d33124),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd916a91b-98')#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.336 186962 INFO nova.virt.libvirt.driver [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Deleting instance files /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5_del#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.336 186962 INFO nova.virt.libvirt.driver [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Deletion of /var/lib/nova/instances/3f269630-23a3-4378-bca8-2177bcee52e5_del complete#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.425 186962 INFO nova.compute.manager [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.426 186962 DEBUG oslo.service.loopingcall [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.426 186962 DEBUG nova.compute.manager [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.426 186962 DEBUG nova.network.neutron [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [NOTICE]   (244307) : haproxy version is 2.8.14-c23fe91
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [NOTICE]   (244307) : path to executable is /usr/sbin/haproxy
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [WARNING]  (244307) : Exiting Master process...
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [WARNING]  (244307) : Exiting Master process...
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [ALERT]    (244307) : Current worker (244309) exited with code 143 (Terminated)
Nov 29 02:35:46 np0005539505 neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0[244303]: [WARNING]  (244307) : All workers exited. Exiting... (0)
Nov 29 02:35:46 np0005539505 systemd[1]: libpod-5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954.scope: Deactivated successfully.
Nov 29 02:35:46 np0005539505 podman[245180]: 2025-11-29 07:35:46.442490621 +0000 UTC m=+0.193932816 container died 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:35:46 np0005539505 nova_compute[186958]: 2025-11-29 07:35:46.762 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954-userdata-shm.mount: Deactivated successfully.
Nov 29 02:35:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay-dcb08d7b4c0df3991bb94c0c038669b1c0fb81315879f7dd53c8d69b63f44a0d-merged.mount: Deactivated successfully.
Nov 29 02:35:47 np0005539505 podman[245180]: 2025-11-29 07:35:47.446915913 +0000 UTC m=+1.198358058 container cleanup 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:35:47 np0005539505 systemd[1]: libpod-conmon-5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954.scope: Deactivated successfully.
Nov 29 02:35:47 np0005539505 podman[245240]: 2025-11-29 07:35:47.560493944 +0000 UTC m=+0.089956296 container remove 5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.565 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac77cda-1f58-4ac2-bda1-cfce4188053a]: (4, ('Sat Nov 29 07:35:46 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 (5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954)\n5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954\nSat Nov 29 07:35:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 (5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954)\n5669cae5a2a217129f3fe75d5fe5157d168ff8c37d2b5118a92afa629328a954\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.567 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b18dec40-9870-4cde-8182-2a9cdad01e88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.568 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee42f4e1-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:47 np0005539505 nova_compute[186958]: 2025-11-29 07:35:47.569 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539505 kernel: tapee42f4e1-70: left promiscuous mode
Nov 29 02:35:47 np0005539505 nova_compute[186958]: 2025-11-29 07:35:47.581 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.583 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bddaf86e-3ae4-4172-8c2e-d3a7e75a60dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.594 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[410249ad-b841-4e14-85d6-aa6acc361b82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.595 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0e1897ec-0e77-41ea-a494-f807a37d697d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.618 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[55efb771-a59d-49fb-a8f2-7d1e76de90ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704227, 'reachable_time': 29894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245257, 'error': None, 'target': 'ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.621 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee42f4e1-7038-4a24-8d9b-8ee99ca415d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.622 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0dfc43-7e53-400c-ac75-c39346a875bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.623 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d916a91b-985b-452b-a2c4-b166a0be47b3 in datapath 716ed53e-cc56-4286-b418-2f5e02d33124 unbound from our chassis#033[00m
Nov 29 02:35:47 np0005539505 systemd[1]: run-netns-ovnmeta\x2dee42f4e1\x2d7038\x2d4a24\x2d8d9b\x2d8ee99ca415d0.mount: Deactivated successfully.
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.624 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 716ed53e-cc56-4286-b418-2f5e02d33124, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.625 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[01e784c9-5508-4e50-8a80-d50050ee92b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.625 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 namespace which is not needed anymore#033[00m
Nov 29 02:35:47 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [NOTICE]   (244420) : haproxy version is 2.8.14-c23fe91
Nov 29 02:35:47 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [NOTICE]   (244420) : path to executable is /usr/sbin/haproxy
Nov 29 02:35:47 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [WARNING]  (244420) : Exiting Master process...
Nov 29 02:35:47 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [ALERT]    (244420) : Current worker (244428) exited with code 143 (Terminated)
Nov 29 02:35:47 np0005539505 neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124[244386]: [WARNING]  (244420) : All workers exited. Exiting... (0)
Nov 29 02:35:47 np0005539505 systemd[1]: libpod-727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6.scope: Deactivated successfully.
Nov 29 02:35:47 np0005539505 podman[245276]: 2025-11-29 07:35:47.748006407 +0000 UTC m=+0.045698219 container died 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:35:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6-userdata-shm.mount: Deactivated successfully.
Nov 29 02:35:47 np0005539505 systemd[1]: var-lib-containers-storage-overlay-d60a516de93f656b45cb56f081b477b380b6ddd8250887ef8bbd07a61b1c5089-merged.mount: Deactivated successfully.
Nov 29 02:35:47 np0005539505 podman[245276]: 2025-11-29 07:35:47.778062154 +0000 UTC m=+0.075753986 container cleanup 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:35:47 np0005539505 systemd[1]: libpod-conmon-727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6.scope: Deactivated successfully.
Nov 29 02:35:47 np0005539505 podman[245291]: 2025-11-29 07:35:47.827343573 +0000 UTC m=+0.054514997 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:35:47 np0005539505 podman[245320]: 2025-11-29 07:35:47.847787289 +0000 UTC m=+0.047273053 container remove 727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.853 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cefa6faf-9c6d-418b-aa60-7d578a82106c]: (4, ('Sat Nov 29 07:35:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 (727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6)\n727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6\nSat Nov 29 07:35:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 (727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6)\n727729c75f97822b7b23f110ec19a032ec7cefad5e6a238c1add4bd7cd9527d6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.854 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a5969325-cb5e-46ef-bb8a-e25696ae6492]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.855 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap716ed53e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:35:47 np0005539505 nova_compute[186958]: 2025-11-29 07:35:47.856 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539505 kernel: tap716ed53e-c0: left promiscuous mode
Nov 29 02:35:47 np0005539505 nova_compute[186958]: 2025-11-29 07:35:47.872 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:47 np0005539505 podman[245300]: 2025-11-29 07:35:47.875598682 +0000 UTC m=+0.093985399 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.875 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[edb023fe-30c7-4498-8432-f5b56a42460e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.886 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b0de991e-3684-459d-8fe5-711f6c8fffc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.887 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb560c7-e2a4-4905-ba9f-ab2264e6a118]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.902 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc6b7713-9565-4c81-825d-c2824b82b37f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 704309, 'reachable_time': 16533, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245362, 'error': None, 'target': 'ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.903 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-716ed53e-cc56-4286-b418-2f5e02d33124 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:35:47 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:35:47.904 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[55975495-4ecb-4be2-b8e0-c06413847b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:35:48 np0005539505 systemd[1]: run-netns-ovnmeta\x2d716ed53e\x2dcc56\x2d4286\x2db418\x2d2f5e02d33124.mount: Deactivated successfully.
Nov 29 02:35:50 np0005539505 podman[245364]: 2025-11-29 07:35:50.727025177 +0000 UTC m=+0.057527702 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:35:50 np0005539505 podman[245365]: 2025-11-29 07:35:50.736038651 +0000 UTC m=+0.062651937 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Nov 29 02:35:51 np0005539505 nova_compute[186958]: 2025-11-29 07:35:51.335 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:51 np0005539505 nova_compute[186958]: 2025-11-29 07:35:51.764 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.966 186962 DEBUG nova.compute.manager [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-unplugged-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.967 186962 DEBUG oslo_concurrency.lockutils [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.967 186962 DEBUG oslo_concurrency.lockutils [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.967 186962 DEBUG oslo_concurrency.lockutils [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.968 186962 DEBUG nova.compute.manager [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No waiting events found dispatching network-vif-unplugged-d916a91b-985b-452b-a2c4-b166a0be47b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:35:53 np0005539505 nova_compute[186958]: 2025-11-29 07:35:53.968 186962 DEBUG nova.compute.manager [req-6dce417d-81c9-4843-a234-b0ba19fbc79f req-ffc01460-8a4c-40c1-84b3-722f946d4e2f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-unplugged-d916a91b-985b-452b-a2c4-b166a0be47b3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:35:54 np0005539505 nova_compute[186958]: 2025-11-29 07:35:54.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:54 np0005539505 nova_compute[186958]: 2025-11-29 07:35:54.311 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:56 np0005539505 nova_compute[186958]: 2025-11-29 07:35:56.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:56 np0005539505 nova_compute[186958]: 2025-11-29 07:35:56.766 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:35:59 np0005539505 nova_compute[186958]: 2025-11-29 07:35:59.392 186962 DEBUG nova.network.neutron [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updated VIF entry in instance network info cache for port 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:35:59 np0005539505 nova_compute[186958]: 2025-11-29 07:35:59.393 186962 DEBUG nova.network.neutron [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d916a91b-985b-452b-a2c4-b166a0be47b3", "address": "fa:16:3e:31:06:4a", "network": {"id": "716ed53e-cc56-4286-b418-2f5e02d33124", "bridge": "br-int", "label": "tempest-network-smoke--990040375", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:64a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd916a91b-98", "ovs_interfaceid": "d916a91b-985b-452b-a2c4-b166a0be47b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.225 186962 DEBUG oslo_concurrency.lockutils [req-e9adf263-8952-4246-9a26-f990653520af req-1998e464-ae9b-4688-b215-998bd91df41e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-3f269630-23a3-4378-bca8-2177bcee52e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.292 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401746.2907948, 3f269630-23a3-4378-bca8-2177bcee52e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.293 186962 INFO nova.compute.manager [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.372 186962 DEBUG nova.compute.manager [req-253840ad-e877-40b9-9462-0f318abba13a req-a443fd51-b25e-43cb-b9bd-06c29f37d9be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-deleted-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.373 186962 INFO nova.compute.manager [req-253840ad-e877-40b9-9462-0f318abba13a req-a443fd51-b25e-43cb-b9bd-06c29f37d9be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Neutron deleted interface d916a91b-985b-452b-a2c4-b166a0be47b3; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.374 186962 DEBUG nova.network.neutron [req-253840ad-e877-40b9-9462-0f318abba13a req-a443fd51-b25e-43cb-b9bd-06c29f37d9be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [{"id": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "address": "fa:16:3e:4a:ee:26", "network": {"id": "ee42f4e1-7038-4a24-8d9b-8ee99ca415d0", "bridge": "br-int", "label": "tempest-network-smoke--1973917221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.244", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c6a9dfe-d9", "ovs_interfaceid": "1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.377 186962 DEBUG nova.compute.manager [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-unplugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.378 186962 DEBUG oslo_concurrency.lockutils [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.378 186962 DEBUG oslo_concurrency.lockutils [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.379 186962 DEBUG oslo_concurrency.lockutils [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.379 186962 DEBUG nova.compute.manager [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No waiting events found dispatching network-vif-unplugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.380 186962 DEBUG nova.compute.manager [req-6eb9d7bc-0ede-496f-97fc-4d96e9b70aaa req-42e3892f-71cb-4bb7-9d2b-deaadbbbebee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-unplugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.380 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.585 186962 DEBUG nova.compute.manager [None req-efa56f9c-f8c2-49c5-87fb-faf9a83851cd - - - - - -] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:36:01 np0005539505 nova_compute[186958]: 2025-11-29 07:36:01.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.300 186962 DEBUG nova.compute.manager [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.301 186962 DEBUG oslo_concurrency.lockutils [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.301 186962 DEBUG oslo_concurrency.lockutils [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.301 186962 DEBUG oslo_concurrency.lockutils [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.302 186962 DEBUG nova.compute.manager [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No waiting events found dispatching network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.302 186962 WARNING nova.compute.manager [req-f68c0fc6-c01c-4d69-b8f9-88476680f75d req-6ce0cc98-d3f2-4963-8ea0-7014983b54c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received unexpected event network-vif-plugged-d916a91b-985b-452b-a2c4-b166a0be47b3 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:36:02 np0005539505 nova_compute[186958]: 2025-11-29 07:36:02.322 186962 DEBUG nova.compute.manager [req-253840ad-e877-40b9-9462-0f318abba13a req-a443fd51-b25e-43cb-b9bd-06c29f37d9be 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Detach interface failed, port_id=d916a91b-985b-452b-a2c4-b166a0be47b3, reason: Instance 3f269630-23a3-4378-bca8-2177bcee52e5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:36:06 np0005539505 nova_compute[186958]: 2025-11-29 07:36:06.383 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:06 np0005539505 podman[245401]: 2025-11-29 07:36:06.745517285 +0000 UTC m=+0.059803576 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:36:06 np0005539505 podman[245400]: 2025-11-29 07:36:06.753005506 +0000 UTC m=+0.073212584 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:36:06 np0005539505 nova_compute[186958]: 2025-11-29 07:36:06.806 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.444 186962 DEBUG nova.compute.manager [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.445 186962 DEBUG oslo_concurrency.lockutils [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.445 186962 DEBUG oslo_concurrency.lockutils [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.445 186962 DEBUG oslo_concurrency.lockutils [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.445 186962 DEBUG nova.compute.manager [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] No waiting events found dispatching network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.446 186962 WARNING nova.compute.manager [req-8a16ce4e-c12d-4e43-a013-07bc80f232f5 req-062deffa-f97c-48a3-826a-bcc42265c505 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received unexpected event network-vif-plugged-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:36:08 np0005539505 nova_compute[186958]: 2025-11-29 07:36:08.656 186962 DEBUG nova.network.neutron [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:09.234 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:36:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:09.235 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:36:09 np0005539505 nova_compute[186958]: 2025-11-29 07:36:09.249 186962 DEBUG nova.compute.manager [req-277062c5-de2b-4d51-9a7d-e52c608a1de5 req-1b53d747-9a3b-4ee0-b6b9-e02ea9fcbd48 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Received event network-vif-deleted-1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:36:09 np0005539505 nova_compute[186958]: 2025-11-29 07:36:09.249 186962 INFO nova.compute.manager [req-277062c5-de2b-4d51-9a7d-e52c608a1de5 req-1b53d747-9a3b-4ee0-b6b9-e02ea9fcbd48 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Neutron deleted interface 1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:36:09 np0005539505 nova_compute[186958]: 2025-11-29 07:36:09.249 186962 DEBUG nova.network.neutron [req-277062c5-de2b-4d51-9a7d-e52c608a1de5 req-1b53d747-9a3b-4ee0-b6b9-e02ea9fcbd48 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:36:09 np0005539505 nova_compute[186958]: 2025-11-29 07:36:09.281 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:09 np0005539505 podman[245441]: 2025-11-29 07:36:09.704938873 +0000 UTC m=+0.044822764 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 29 02:36:10 np0005539505 nova_compute[186958]: 2025-11-29 07:36:10.081 186962 INFO nova.compute.manager [-] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Took 23.66 seconds to deallocate network for instance.#033[00m
Nov 29 02:36:10 np0005539505 nova_compute[186958]: 2025-11-29 07:36:10.086 186962 DEBUG nova.compute.manager [req-277062c5-de2b-4d51-9a7d-e52c608a1de5 req-1b53d747-9a3b-4ee0-b6b9-e02ea9fcbd48 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 3f269630-23a3-4378-bca8-2177bcee52e5] Detach interface failed, port_id=1c6a9dfe-d9c5-4fcc-8ca8-a1a517e9ef28, reason: Instance 3f269630-23a3-4378-bca8-2177bcee52e5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:36:10 np0005539505 nova_compute[186958]: 2025-11-29 07:36:10.493 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:10 np0005539505 nova_compute[186958]: 2025-11-29 07:36:10.494 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:10 np0005539505 nova_compute[186958]: 2025-11-29 07:36:10.567 186962 DEBUG nova.compute.provider_tree [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:36:11 np0005539505 nova_compute[186958]: 2025-11-29 07:36:11.387 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:11 np0005539505 nova_compute[186958]: 2025-11-29 07:36:11.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:11 np0005539505 nova_compute[186958]: 2025-11-29 07:36:11.871 186962 DEBUG nova.scheduler.client.report [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:36:13 np0005539505 nova_compute[186958]: 2025-11-29 07:36:13.342 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:14.237 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:36:15 np0005539505 nova_compute[186958]: 2025-11-29 07:36:15.124 186962 INFO nova.scheduler.client.report [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 3f269630-23a3-4378-bca8-2177bcee52e5#033[00m
Nov 29 02:36:15 np0005539505 nova_compute[186958]: 2025-11-29 07:36:15.349 186962 DEBUG oslo_concurrency.lockutils [None req-69ec423d-e552-444e-89a9-560824533933 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "3f269630-23a3-4378-bca8-2177bcee52e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 29.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:16 np0005539505 nova_compute[186958]: 2025-11-29 07:36:16.391 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:16 np0005539505 nova_compute[186958]: 2025-11-29 07:36:16.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:18 np0005539505 podman[245461]: 2025-11-29 07:36:18.718021566 +0000 UTC m=+0.052187332 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:36:18 np0005539505 podman[245462]: 2025-11-29 07:36:18.754267727 +0000 UTC m=+0.085186121 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Nov 29 02:36:21 np0005539505 nova_compute[186958]: 2025-11-29 07:36:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:21 np0005539505 nova_compute[186958]: 2025-11-29 07:36:21.394 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:21 np0005539505 podman[245514]: 2025-11-29 07:36:21.728155123 +0000 UTC m=+0.058670504 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:36:21 np0005539505 podman[245515]: 2025-11-29 07:36:21.741942902 +0000 UTC m=+0.061997928 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:36:21 np0005539505 nova_compute[186958]: 2025-11-29 07:36:21.815 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:22 np0005539505 nova_compute[186958]: 2025-11-29 07:36:22.376 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:23 np0005539505 nova_compute[186958]: 2025-11-29 07:36:23.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:26 np0005539505 nova_compute[186958]: 2025-11-29 07:36:26.398 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:26 np0005539505 nova_compute[186958]: 2025-11-29 07:36:26.865 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:27 np0005539505 nova_compute[186958]: 2025-11-29 07:36:27.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:27 np0005539505 nova_compute[186958]: 2025-11-29 07:36:27.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:36:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:27.516 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:27.516 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:36:27.516 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:30 np0005539505 nova_compute[186958]: 2025-11-29 07:36:30.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:31 np0005539505 nova_compute[186958]: 2025-11-29 07:36:31.402 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:31 np0005539505 nova_compute[186958]: 2025-11-29 07:36:31.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:36:32 np0005539505 nova_compute[186958]: 2025-11-29 07:36:32.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:32 np0005539505 nova_compute[186958]: 2025-11-29 07:36:32.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:36:32 np0005539505 nova_compute[186958]: 2025-11-29 07:36:32.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:36:33 np0005539505 nova_compute[186958]: 2025-11-29 07:36:33.878 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:36:33 np0005539505 nova_compute[186958]: 2025-11-29 07:36:33.879 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.752 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.753 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.753 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.753 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.908 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.910 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5734MB free_disk=73.07366561889648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.910 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:36:35 np0005539505 nova_compute[186958]: 2025-11-29 07:36:35.910 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.024 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.025 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.045 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.406 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:36 np0005539505 nova_compute[186958]: 2025-11-29 07:36:36.916 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:36:37 np0005539505 nova_compute[186958]: 2025-11-29 07:36:37.017 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:36:37 np0005539505 nova_compute[186958]: 2025-11-29 07:36:37.018 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:36:37 np0005539505 podman[245559]: 2025-11-29 07:36:37.716195831 +0000 UTC m=+0.045231076 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:36:37 np0005539505 podman[245558]: 2025-11-29 07:36:37.719959637 +0000 UTC m=+0.053775986 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350)
Nov 29 02:36:39 np0005539505 nova_compute[186958]: 2025-11-29 07:36:39.013 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:36:40 np0005539505 podman[245599]: 2025-11-29 07:36:40.739001415 +0000 UTC m=+0.075481758 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:36:41 np0005539505 nova_compute[186958]: 2025-11-29 07:36:41.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:36:41 np0005539505 nova_compute[186958]: 2025-11-29 07:36:41.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:41 np0005539505 nova_compute[186958]: 2025-11-29 07:36:41.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:46 np0005539505 nova_compute[186958]: 2025-11-29 07:36:46.428 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:46 np0005539505 nova_compute[186958]: 2025-11-29 07:36:46.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:36:48.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:36:49 np0005539505 podman[245619]: 2025-11-29 07:36:49.717927437 +0000 UTC m=+0.055763592 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:36:49 np0005539505 podman[245620]: 2025-11-29 07:36:49.775471848 +0000 UTC m=+0.100415370 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 29 02:36:51 np0005539505 nova_compute[186958]: 2025-11-29 07:36:51.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:51 np0005539505 nova_compute[186958]: 2025-11-29 07:36:51.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:52 np0005539505 podman[245667]: 2025-11-29 07:36:52.718722412 +0000 UTC m=+0.053853999 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 02:36:52 np0005539505 podman[245668]: 2025-11-29 07:36:52.750153877 +0000 UTC m=+0.071088654 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Nov 29 02:36:56 np0005539505 nova_compute[186958]: 2025-11-29 07:36:56.435 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:36:56 np0005539505 nova_compute[186958]: 2025-11-29 07:36:56.907 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:01 np0005539505 nova_compute[186958]: 2025-11-29 07:37:01.438 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:01 np0005539505 nova_compute[186958]: 2025-11-29 07:37:01.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:06 np0005539505 nova_compute[186958]: 2025-11-29 07:37:06.442 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:06 np0005539505 nova_compute[186958]: 2025-11-29 07:37:06.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:08 np0005539505 podman[245707]: 2025-11-29 07:37:08.738292619 +0000 UTC m=+0.071240259 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 29 02:37:08 np0005539505 podman[245708]: 2025-11-29 07:37:08.766095032 +0000 UTC m=+0.094965107 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:37:11 np0005539505 nova_compute[186958]: 2025-11-29 07:37:11.489 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:11 np0005539505 podman[245752]: 2025-11-29 07:37:11.745581756 +0000 UTC m=+0.069937402 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:37:11 np0005539505 nova_compute[186958]: 2025-11-29 07:37:11.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:13.527 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:13.528 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:37:13 np0005539505 nova_compute[186958]: 2025-11-29 07:37:13.529 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:13 np0005539505 nova_compute[186958]: 2025-11-29 07:37:13.899 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:13 np0005539505 nova_compute[186958]: 2025-11-29 07:37:13.900 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:13 np0005539505 nova_compute[186958]: 2025-11-29 07:37:13.922 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.086 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.087 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.099 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.099 186962 INFO nova.compute.claims [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.258 186962 DEBUG nova.compute.provider_tree [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.278 186962 DEBUG nova.scheduler.client.report [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.329 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.330 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.392 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.393 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.420 186962 INFO nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.438 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.590 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.593 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.593 186962 INFO nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Creating image(s)#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.596 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.597 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.599 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.624 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.664 186962 DEBUG nova.policy [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.716 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.717 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.719 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.745 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.838 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.840 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.895 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.896 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.897 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.969 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.970 186962 DEBUG nova.virt.disk.api [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Checking if we can resize image /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:37:14 np0005539505 nova_compute[186958]: 2025-11-29 07:37:14.970 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.058 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.059 186962 DEBUG nova.virt.disk.api [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Cannot resize image /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.059 186962 DEBUG nova.objects.instance [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'migration_context' on Instance uuid 2a9bd960-7f4e-411b-b743-70064e15a0d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.389 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.390 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Ensure instance console log exists: /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.391 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.391 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:15 np0005539505 nova_compute[186958]: 2025-11-29 07:37:15.391 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:16 np0005539505 nova_compute[186958]: 2025-11-29 07:37:16.493 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:16 np0005539505 nova_compute[186958]: 2025-11-29 07:37:16.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:18 np0005539505 nova_compute[186958]: 2025-11-29 07:37:18.305 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Successfully created port: f838accb-ce0a-45ec-bc0d-5aaaae01c4de _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:37:18 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:18.531 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:20 np0005539505 podman[245786]: 2025-11-29 07:37:20.733006137 +0000 UTC m=+0.066258658 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:37:20 np0005539505 podman[245787]: 2025-11-29 07:37:20.768445325 +0000 UTC m=+0.093812634 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:37:20 np0005539505 nova_compute[186958]: 2025-11-29 07:37:20.979 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Successfully created port: 7b28d182-bf32-4711-805d-1bdeb68b89e9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:37:21 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:21Z|00724|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Nov 29 02:37:21 np0005539505 nova_compute[186958]: 2025-11-29 07:37:21.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:21 np0005539505 nova_compute[186958]: 2025-11-29 07:37:21.918 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.359 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Successfully updated port: f838accb-ce0a-45ec-bc0d-5aaaae01c4de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.480 186962 DEBUG nova.compute.manager [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.481 186962 DEBUG nova.compute.manager [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing instance network info cache due to event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.482 186962 DEBUG oslo_concurrency.lockutils [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.482 186962 DEBUG oslo_concurrency.lockutils [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.483 186962 DEBUG nova.network.neutron [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing network info cache for port f838accb-ce0a-45ec-bc0d-5aaaae01c4de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:37:22 np0005539505 nova_compute[186958]: 2025-11-29 07:37:22.721 186962 DEBUG nova.network.neutron [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.218 186962 DEBUG nova.network.neutron [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.411 186962 DEBUG oslo_concurrency.lockutils [req-2cc5d496-77da-4da2-9573-17203c5cedb5 req-6257dcd8-f719-479d-8086-dc6f51164446 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.523 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Successfully updated port: 7b28d182-bf32-4711-805d-1bdeb68b89e9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:37:23 np0005539505 podman[245834]: 2025-11-29 07:37:23.752364954 +0000 UTC m=+0.082982009 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:37:23 np0005539505 podman[245835]: 2025-11-29 07:37:23.766612486 +0000 UTC m=+0.083609307 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.982 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.982 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:37:23 np0005539505 nova_compute[186958]: 2025-11-29 07:37:23.982 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:37:24 np0005539505 nova_compute[186958]: 2025-11-29 07:37:24.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:26 np0005539505 nova_compute[186958]: 2025-11-29 07:37:26.502 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:26 np0005539505 nova_compute[186958]: 2025-11-29 07:37:26.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:27.518 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:27.519 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:27.519 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:27 np0005539505 nova_compute[186958]: 2025-11-29 07:37:27.632 186962 DEBUG nova.compute.manager [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-changed-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:27 np0005539505 nova_compute[186958]: 2025-11-29 07:37:27.633 186962 DEBUG nova.compute.manager [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing instance network info cache due to event network-changed-7b28d182-bf32-4711-805d-1bdeb68b89e9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:37:27 np0005539505 nova_compute[186958]: 2025-11-29 07:37:27.633 186962 DEBUG oslo_concurrency.lockutils [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:37:28 np0005539505 nova_compute[186958]: 2025-11-29 07:37:28.307 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:37:29 np0005539505 nova_compute[186958]: 2025-11-29 07:37:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:29 np0005539505 nova_compute[186958]: 2025-11-29 07:37:29.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:37:30 np0005539505 nova_compute[186958]: 2025-11-29 07:37:30.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:31 np0005539505 nova_compute[186958]: 2025-11-29 07:37:31.506 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:31 np0005539505 nova_compute[186958]: 2025-11-29 07:37:31.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.598 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.599 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:37:32 np0005539505 nova_compute[186958]: 2025-11-29 07:37:32.602 186962 DEBUG nova.network.neutron [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.565 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.566 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance network_info: |[{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.569 186962 DEBUG oslo_concurrency.lockutils [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.569 186962 DEBUG nova.network.neutron [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing network info cache for port 7b28d182-bf32-4711-805d-1bdeb68b89e9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.573 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Start _get_guest_xml network_info=[{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.579 186962 WARNING nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.585 186962 DEBUG nova.virt.libvirt.host [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.585 186962 DEBUG nova.virt.libvirt.host [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.589 186962 DEBUG nova.virt.libvirt.host [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.590 186962 DEBUG nova.virt.libvirt.host [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.591 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.592 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.592 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.593 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.593 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.593 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.593 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.594 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.595 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.595 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.596 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.596 186962 DEBUG nova.virt.hardware [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.600 186962 DEBUG nova.virt.libvirt.vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:37:14Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.600 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.601 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.602 186962 DEBUG nova.virt.libvirt.vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:37:14Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.602 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.603 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:37:33 np0005539505 nova_compute[186958]: 2025-11-29 07:37:33.603 186962 DEBUG nova.objects.instance [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a9bd960-7f4e-411b-b743-70064e15a0d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.039 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <uuid>2a9bd960-7f4e-411b-b743-70064e15a0d7</uuid>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <name>instance-00000098</name>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestGettingAddress-server-1620649343</nova:name>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:37:33</nova:creationTime>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:user uuid="31ac7b05b012433b89143dc9f259644a">tempest-TestGettingAddress-1465017630-project-member</nova:user>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:project uuid="0111c22b4b954ea586ca20d91ed3970f">tempest-TestGettingAddress-1465017630</nova:project>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:port uuid="f838accb-ce0a-45ec-bc0d-5aaaae01c4de">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        <nova:port uuid="7b28d182-bf32-4711-805d-1bdeb68b89e9">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe4a:bc1e" ipVersion="6"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="serial">2a9bd960-7f4e-411b-b743-70064e15a0d7</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="uuid">2a9bd960-7f4e-411b-b743-70064e15a0d7</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.config"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:be:9f:d4"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <target dev="tapf838accb-ce"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:4a:bc:1e"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <target dev="tap7b28d182-bf"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/console.log" append="off"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:37:34 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:37:34 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:37:34 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:37:34 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.041 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Preparing to wait for external event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.042 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.042 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.042 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.042 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Preparing to wait for external event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.043 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.043 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.043 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.044 186962 DEBUG nova.virt.libvirt.vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:37:14Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.044 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.045 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.045 186962 DEBUG os_vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.046 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.047 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.051 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf838accb-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.051 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf838accb-ce, col_values=(('external_ids', {'iface-id': 'f838accb-ce0a-45ec-bc0d-5aaaae01c4de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:9f:d4', 'vm-uuid': '2a9bd960-7f4e-411b-b743-70064e15a0d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.053 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 NetworkManager[55134]: <info>  [1764401854.0546] manager: (tapf838accb-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.058 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.059 186962 INFO os_vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce')#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.060 186962 DEBUG nova.virt.libvirt.vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:37:14Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.060 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.061 186962 DEBUG nova.network.os_vif_util [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.061 186962 DEBUG os_vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.061 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.061 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.062 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.064 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.064 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b28d182-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.064 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b28d182-bf, col_values=(('external_ids', {'iface-id': '7b28d182-bf32-4711-805d-1bdeb68b89e9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:bc:1e', 'vm-uuid': '2a9bd960-7f4e-411b-b743-70064e15a0d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 NetworkManager[55134]: <info>  [1764401854.0659] manager: (tap7b28d182-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.073 186962 INFO os_vif [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf')#033[00m
Nov 29 02:37:34 np0005539505 nova_compute[186958]: 2025-11-29 07:37:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:35 np0005539505 nova_compute[186958]: 2025-11-29 07:37:35.707 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:37:35 np0005539505 nova_compute[186958]: 2025-11-29 07:37:35.708 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:37:35 np0005539505 nova_compute[186958]: 2025-11-29 07:37:35.708 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:be:9f:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:37:35 np0005539505 nova_compute[186958]: 2025-11-29 07:37:35.708 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] No VIF found with MAC fa:16:3e:4a:bc:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:37:35 np0005539505 nova_compute[186958]: 2025-11-29 07:37:35.709 186962 INFO nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Using config drive#033[00m
Nov 29 02:37:36 np0005539505 nova_compute[186958]: 2025-11-29 07:37:36.262 186962 DEBUG nova.network.neutron [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updated VIF entry in instance network info cache for port 7b28d182-bf32-4711-805d-1bdeb68b89e9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:37:36 np0005539505 nova_compute[186958]: 2025-11-29 07:37:36.262 186962 DEBUG nova.network.neutron [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:37:36 np0005539505 nova_compute[186958]: 2025-11-29 07:37:36.924 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.149 186962 DEBUG oslo_concurrency.lockutils [req-3a2e17bf-b7f2-4bc2-ada7-3e50e3213a96 req-6c028c6e-ff36-4914-b72b-f8e2b23ff220 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.173 186962 INFO nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Creating config drive at /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.config#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.178 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbf0ucrx9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.304 186962 DEBUG oslo_concurrency.processutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbf0ucrx9" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.3609] manager: (tapf838accb-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/355)
Nov 29 02:37:37 np0005539505 kernel: tapf838accb-ce: entered promiscuous mode
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.365 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00725|binding|INFO|Claiming lport f838accb-ce0a-45ec-bc0d-5aaaae01c4de for this chassis.
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00726|binding|INFO|f838accb-ce0a-45ec-bc0d-5aaaae01c4de: Claiming fa:16:3e:be:9f:d4 10.100.0.11
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.3766] manager: (tap7b28d182-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/356)
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:37 np0005539505 systemd-udevd[245898]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:37:37 np0005539505 systemd-udevd[245899]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.4101] device (tapf838accb-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.4115] device (tapf838accb-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:37:37 np0005539505 systemd-machined[153285]: New machine qemu-78-instance-00000098.
Nov 29 02:37:37 np0005539505 kernel: tap7b28d182-bf: entered promiscuous mode
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.4448] device (tap7b28d182-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.4467] device (tap7b28d182-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.446 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00727|if_status|INFO|Not updating pb chassis for 7b28d182-bf32-4711-805d-1bdeb68b89e9 now as sb is readonly
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.448 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 systemd[1]: Started Virtual Machine qemu-78-instance-00000098.
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.456 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00728|binding|INFO|Setting lport f838accb-ce0a-45ec-bc0d-5aaaae01c4de ovn-installed in OVS
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00729|binding|INFO|Claiming lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 for this chassis.
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00730|binding|INFO|7b28d182-bf32-4711-805d-1bdeb68b89e9: Claiming fa:16:3e:4a:bc:1e 2001:db8::f816:3eff:fe4a:bc1e
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00731|binding|INFO|Setting lport f838accb-ce0a-45ec-bc0d-5aaaae01c4de up in Southbound
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00732|binding|INFO|Setting lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 ovn-installed in OVS
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.520 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:9f:d4 10.100.0.11'], port_security=['fa:16:3e:be:9f:d4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c7b7bb-9c39-4f1c-a218-71c5fbf31db4, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=f838accb-ce0a-45ec-bc0d-5aaaae01c4de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.521 104094 INFO neutron.agent.ovn.metadata.agent [-] Port f838accb-ce0a-45ec-bc0d-5aaaae01c4de in datapath 51013e93-c048-46cc-9a9d-a184eb63e1b4 bound to our chassis#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.523 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51013e93-c048-46cc-9a9d-a184eb63e1b4#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.533 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04612e52-fb18-446a-b7b3-e0416c6d2209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.534 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51013e93-c1 in ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.536 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51013e93-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.536 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[404f2549-1896-432f-9ede-ed454056315b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.536 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1496f11-f1af-4641-b6c8-7913bd0e8037]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.549 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[699da77c-a67a-48fe-acb4-4e28953b9d4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.573 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f5429353-1445-4cc6-9f3b-674b19b9f297]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.600 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[54b46b50-9882-47f9-ae27-c5c6b372efe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.604 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[750bf00d-26c7-47ac-82ad-d1f003e93648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.6060] manager: (tap51013e93-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/357)
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.637 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ae09bdc0-e4a3-4512-91ba-c84f67d91999]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.640 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c8484e25-9e55-443f-8b3e-9c832d3ce5bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.6625] device (tap51013e93-c0): carrier: link connected
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.668 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[75c3e05b-5e4a-4f6c-8915-e9019102b6e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.683 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[57a7ec41-9fd5-473f-8aa2-a07a5db0061f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51013e93-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:0a:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730524, 'reachable_time': 44797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245935, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.696 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[91238373-d8be-4edf-8755-77eccdfcf52e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe01:a76'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 730524, 'tstamp': 730524}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245936, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.718 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7884a66d-320d-4d89-af8f-71933aa4a7d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51013e93-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:01:0a:76'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 228], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730524, 'reachable_time': 44797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245937, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.742 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe4fbbc-fcc2-4075-af97-57a7cc8bbbc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.795 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[22318e67-dc6c-4aef-a551-d71cd0b59fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.796 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51013e93-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.797 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.797 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51013e93-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.843 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 kernel: tap51013e93-c0: entered promiscuous mode
Nov 29 02:37:37 np0005539505 NetworkManager[55134]: <info>  [1764401857.8461] manager: (tap51013e93-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/358)
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.848 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.848 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51013e93-c0, col_values=(('external_ids', {'iface-id': '2ba344d6-366c-48e1-aea9-1f498f9fe4ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00733|binding|INFO|Releasing lport 2ba344d6-366c-48e1-aea9-1f498f9fe4ff from this chassis (sb_readonly=1)
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.850 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.861 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.862 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.862 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4a1598cf-fccd-4be2-abc9-55c885cae0e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.863 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-51013e93-c048-46cc-9a9d-a184eb63e1b4
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/51013e93-c048-46cc-9a9d-a184eb63e1b4.pid.haproxy
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 51013e93-c048-46cc-9a9d-a184eb63e1b4
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.864 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'env', 'PROCESS_TAG=haproxy-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51013e93-c048-46cc-9a9d-a184eb63e1b4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.873 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.873 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.874 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.874 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:37:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:37Z|00734|binding|INFO|Setting lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 up in Southbound
Nov 29 02:37:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:37.923 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:bc:1e 2001:db8::f816:3eff:fe4a:bc1e'], port_security=['fa:16:3e:4a:bc:1e 2001:db8::f816:3eff:fe4a:bc1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4a:bc1e/64', 'neutron:device_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3085017e-01d1-448e-9eca-033b34f9e960', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3544b146-f2db-4cb1-8570-c2c0dfd4e173, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7b28d182-bf32-4711-805d-1bdeb68b89e9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.968 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401857.9679146, 2a9bd960-7f4e-411b-b743-70064e15a0d7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:37:37 np0005539505 nova_compute[186958]: 2025-11-29 07:37:37.969 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] VM Started (Lifecycle Event)#033[00m
Nov 29 02:37:38 np0005539505 podman[245976]: 2025-11-29 07:37:38.221665148 +0000 UTC m=+0.061267368 container create 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:37:38 np0005539505 systemd[1]: Started libpod-conmon-562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94.scope.
Nov 29 02:37:38 np0005539505 podman[245976]: 2025-11-29 07:37:38.187621519 +0000 UTC m=+0.027223759 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:37:38 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:37:38 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f004819139b7dbd682e3fdc1fe0beb44aac12f3779ddd603726f1e3ddb5e346/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:37:38 np0005539505 podman[245976]: 2025-11-29 07:37:38.302899597 +0000 UTC m=+0.142501837 container init 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:37:38 np0005539505 podman[245976]: 2025-11-29 07:37:38.309264046 +0000 UTC m=+0.148866266 container start 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:37:38 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [NOTICE]   (245995) : New worker (245997) forked
Nov 29 02:37:38 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [NOTICE]   (245995) : Loading success.
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.358 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7b28d182-bf32-4711-805d-1bdeb68b89e9 in datapath 3085017e-01d1-448e-9eca-033b34f9e960 bound to our chassis#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.360 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3085017e-01d1-448e-9eca-033b34f9e960#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.372 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e688e5a6-876f-412f-8642-998f22b71a7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.373 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3085017e-01 in ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.374 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3085017e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.375 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3ba24c-7943-4c8a-9958-a006f7493836]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.375 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c0efc3f4-67d5-47d6-b07d-05ee45f796d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.384 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[45ab85ca-40c1-485b-a8f6-718c556ff372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.406 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[683a2b79-0c91-409b-b29c-d6f78533101d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.430 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[782de939-2e95-4830-bc39-8df4aeff499b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 systemd-udevd[245932]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:37:38 np0005539505 NetworkManager[55134]: <info>  [1764401858.4392] manager: (tap3085017e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/359)
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.439 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[290616f5-c15e-4ad4-9e57-963619823e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.468 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b27d47b8-2df0-403e-b38f-be6004f3c532]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.471 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[61581684-56d0-4834-8a91-f5a9c1cd4a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 NetworkManager[55134]: <info>  [1764401858.4966] device (tap3085017e-00): carrier: link connected
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.502 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[707ab8cf-6d58-41be-b9fe-ed3575eca02c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.519 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3a157f91-315c-4af0-bae8-6f9f7b0b0024]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3085017e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730608, 'reachable_time': 25309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246016, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.537 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[649d8e1f-5f08-4ba0-9e8d-f8db2c08b127]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:4adf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 730608, 'tstamp': 730608}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246017, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.557 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ee5740-e6dc-460a-a4db-c1d9f084930b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3085017e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:4a:df'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730608, 'reachable_time': 25309, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246018, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.582 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9904e01f-e432-4890-9c00-60293db10d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.605 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0da4f74f-bec3-44b0-be8b-96decde5f4d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.607 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3085017e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.607 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.608 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3085017e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:38 np0005539505 NetworkManager[55134]: <info>  [1764401858.6108] manager: (tap3085017e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/360)
Nov 29 02:37:38 np0005539505 kernel: tap3085017e-00: entered promiscuous mode
Nov 29 02:37:38 np0005539505 nova_compute[186958]: 2025-11-29 07:37:38.611 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.613 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3085017e-00, col_values=(('external_ids', {'iface-id': '59e75940-874d-45e0-8a95-c21e1cf0e54f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:37:38 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:38Z|00735|binding|INFO|Releasing lport 59e75940-874d-45e0-8a95-c21e1cf0e54f from this chassis (sb_readonly=1)
Nov 29 02:37:38 np0005539505 nova_compute[186958]: 2025-11-29 07:37:38.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:38 np0005539505 nova_compute[186958]: 2025-11-29 07:37:38.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.625 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.626 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec9fbf7-9d13-49b8-91c2-8e3b4e277625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.626 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-3085017e-01d1-448e-9eca-033b34f9e960
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/3085017e-01d1-448e-9eca-033b34f9e960.pid.haproxy
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 3085017e-01d1-448e-9eca-033b34f9e960
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:37:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:38.627 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'env', 'PROCESS_TAG=haproxy-3085017e-01d1-448e-9eca-033b34f9e960', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3085017e-01d1-448e-9eca-033b34f9e960.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:37:38 np0005539505 podman[246048]: 2025-11-29 07:37:38.973278796 +0000 UTC m=+0.048259120 container create 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:37:39 np0005539505 systemd[1]: Started libpod-conmon-0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa.scope.
Nov 29 02:37:39 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:37:39 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e339c1dc48483187fa10efd533323f7463969e4bf557d0a348037db376560c4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:37:39 np0005539505 podman[246048]: 2025-11-29 07:37:39.032913837 +0000 UTC m=+0.107894181 container init 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:37:39 np0005539505 podman[246048]: 2025-11-29 07:37:39.038114333 +0000 UTC m=+0.113094657 container start 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:37:39 np0005539505 podman[246048]: 2025-11-29 07:37:38.946610245 +0000 UTC m=+0.021590589 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:37:39 np0005539505 podman[246064]: 2025-11-29 07:37:39.056969055 +0000 UTC m=+0.049040453 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:37:39 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [NOTICE]   (246096) : New worker (246115) forked
Nov 29 02:37:39 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [NOTICE]   (246096) : Loading success.
Nov 29 02:37:39 np0005539505 podman[246061]: 2025-11-29 07:37:39.063861049 +0000 UTC m=+0.057931424 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Nov 29 02:37:39 np0005539505 nova_compute[186958]: 2025-11-29 07:37:39.065 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.177 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.184 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401857.9683301, 2a9bd960-7f4e-411b-b743-70064e15a0d7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.184 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.868 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.872 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.889 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.927 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.928 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:37:40 np0005539505 nova_compute[186958]: 2025-11-29 07:37:40.978 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.128 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.129 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5664MB free_disk=73.07282257080078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.130 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.130 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.267 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.537 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 2a9bd960-7f4e-411b-b743-70064e15a0d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.538 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.538 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.605 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:37:41 np0005539505 nova_compute[186958]: 2025-11-29 07:37:41.926 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:42 np0005539505 nova_compute[186958]: 2025-11-29 07:37:42.669 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:37:42 np0005539505 podman[246131]: 2025-11-29 07:37:42.723929009 +0000 UTC m=+0.052928852 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.253 186962 DEBUG nova.compute.manager [req-f4a069a0-2406-4676-8e31-1857db89e2d7 req-3a7b6e8b-7cf3-422d-80d1-0459ef60bc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.254 186962 DEBUG oslo_concurrency.lockutils [req-f4a069a0-2406-4676-8e31-1857db89e2d7 req-3a7b6e8b-7cf3-422d-80d1-0459ef60bc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.254 186962 DEBUG oslo_concurrency.lockutils [req-f4a069a0-2406-4676-8e31-1857db89e2d7 req-3a7b6e8b-7cf3-422d-80d1-0459ef60bc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.254 186962 DEBUG oslo_concurrency.lockutils [req-f4a069a0-2406-4676-8e31-1857db89e2d7 req-3a7b6e8b-7cf3-422d-80d1-0459ef60bc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:44 np0005539505 nova_compute[186958]: 2025-11-29 07:37:44.254 186962 DEBUG nova.compute.manager [req-f4a069a0-2406-4676-8e31-1857db89e2d7 req-3a7b6e8b-7cf3-422d-80d1-0459ef60bc5c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Processing event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:37:45 np0005539505 nova_compute[186958]: 2025-11-29 07:37:45.665 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:37:45 np0005539505 nova_compute[186958]: 2025-11-29 07:37:45.665 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.535s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:45 np0005539505 nova_compute[186958]: 2025-11-29 07:37:45.666 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:46 np0005539505 nova_compute[186958]: 2025-11-29 07:37:46.952 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.708 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.709 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.870 186962 DEBUG nova.compute.manager [req-563dc648-d217-4b08-ab11-5468f4350510 req-d5808e17-8da3-47c1-a3a8-278e6ba28b4a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.870 186962 DEBUG oslo_concurrency.lockutils [req-563dc648-d217-4b08-ab11-5468f4350510 req-d5808e17-8da3-47c1-a3a8-278e6ba28b4a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.871 186962 DEBUG oslo_concurrency.lockutils [req-563dc648-d217-4b08-ab11-5468f4350510 req-d5808e17-8da3-47c1-a3a8-278e6ba28b4a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.871 186962 DEBUG oslo_concurrency.lockutils [req-563dc648-d217-4b08-ab11-5468f4350510 req-d5808e17-8da3-47c1-a3a8-278e6ba28b4a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.872 186962 DEBUG nova.compute.manager [req-563dc648-d217-4b08-ab11-5468f4350510 req-d5808e17-8da3-47c1-a3a8-278e6ba28b4a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Processing event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.873 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance event wait completed in 9 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.877 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764401867.8774018, 2a9bd960-7f4e-411b-b743-70064e15a0d7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.878 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.883 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.887 186962 INFO nova.virt.libvirt.driver [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance spawned successfully.#033[00m
Nov 29 02:37:47 np0005539505 nova_compute[186958]: 2025-11-29 07:37:47.888 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.069 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.686 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.694 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.695 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.696 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.697 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.698 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.699 186962 DEBUG nova.virt.libvirt.driver [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:37:49 np0005539505 nova_compute[186958]: 2025-11-29 07:37:49.705 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.348 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.917 186962 DEBUG nova.compute.manager [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.918 186962 DEBUG oslo_concurrency.lockutils [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.918 186962 DEBUG oslo_concurrency.lockutils [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.919 186962 DEBUG oslo_concurrency.lockutils [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.919 186962 DEBUG nova.compute.manager [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:37:50 np0005539505 nova_compute[186958]: 2025-11-29 07:37:50.919 186962 WARNING nova.compute.manager [req-5fc332fc-4c64-4e66-87c6-539f630f054b req-dac03f62-6e9e-4c89-9a0e-e9bcccda4716 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received unexpected event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 for instance with vm_state building and task_state spawning.#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.039 186962 INFO nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Took 36.45 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.040 186962 DEBUG nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.637 186962 INFO nova.compute.manager [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Took 37.62 seconds to build instance.#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.671 186962 DEBUG oslo_concurrency.lockutils [None req-b4028e70-2897-4cbb-82b9-6b352b23fc09 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 37.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:51 np0005539505 podman[246152]: 2025-11-29 07:37:51.734000159 +0000 UTC m=+0.056964036 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:37:51 np0005539505 podman[246153]: 2025-11-29 07:37:51.765393233 +0000 UTC m=+0.082470514 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.921 186962 DEBUG nova.compute.manager [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.921 186962 DEBUG oslo_concurrency.lockutils [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.922 186962 DEBUG oslo_concurrency.lockutils [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.922 186962 DEBUG oslo_concurrency.lockutils [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.922 186962 DEBUG nova.compute.manager [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.923 186962 WARNING nova.compute.manager [req-b0bafb43-677a-4276-b6f2-d7ddebacb9f3 req-ed64aa59-2507-4bfd-bca3-6bd01637823f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received unexpected event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de for instance with vm_state active and task_state None.#033[00m
Nov 29 02:37:51 np0005539505 nova_compute[186958]: 2025-11-29 07:37:51.953 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:54 np0005539505 nova_compute[186958]: 2025-11-29 07:37:54.115 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:54 np0005539505 podman[246201]: 2025-11-29 07:37:54.721873439 +0000 UTC m=+0.054588319 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:37:54 np0005539505 podman[246202]: 2025-11-29 07:37:54.725968234 +0000 UTC m=+0.054214849 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:37:55 np0005539505 NetworkManager[55134]: <info>  [1764401875.9807] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Nov 29 02:37:55 np0005539505 NetworkManager[55134]: <info>  [1764401875.9814] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/362)
Nov 29 02:37:55 np0005539505 nova_compute[186958]: 2025-11-29 07:37:55.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.183 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:56Z|00736|binding|INFO|Releasing lport 2ba344d6-366c-48e1-aea9-1f498f9fe4ff from this chassis (sb_readonly=0)
Nov 29 02:37:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:56Z|00737|binding|INFO|Releasing lport 59e75940-874d-45e0-8a95-c21e1cf0e54f from this chassis (sb_readonly=0)
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.314 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.393 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:37:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:56.433 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:37:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:37:56.435 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.442 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.498 186962 DEBUG nova.compute.manager [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.499 186962 DEBUG nova.compute.manager [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing instance network info cache due to event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.499 186962 DEBUG oslo_concurrency.lockutils [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.500 186962 DEBUG oslo_concurrency.lockutils [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.500 186962 DEBUG nova.network.neutron [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing network info cache for port f838accb-ce0a-45ec-bc0d-5aaaae01c4de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 02:37:56 np0005539505 nova_compute[186958]: 2025-11-29 07:37:56.955 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:57 np0005539505 nova_compute[186958]: 2025-11-29 07:37:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:37:57 np0005539505 nova_compute[186958]: 2025-11-29 07:37:57.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 02:37:58 np0005539505 nova_compute[186958]: 2025-11-29 07:37:58.260 186962 DEBUG nova.network.neutron [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updated VIF entry in instance network info cache for port f838accb-ce0a-45ec-bc0d-5aaaae01c4de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:37:58 np0005539505 nova_compute[186958]: 2025-11-29 07:37:58.261 186962 DEBUG nova.network.neutron [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:37:58 np0005539505 nova_compute[186958]: 2025-11-29 07:37:58.284 186962 DEBUG oslo_concurrency.lockutils [req-95417360-fa82-4ed7-aba5-d2abe06b7a99 req-b2cab42d-df7b-4e9c-b772-33a99827115d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:37:59 np0005539505 nova_compute[186958]: 2025-11-29 07:37:59.118 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:37:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:59Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:9f:d4 10.100.0.11
Nov 29 02:37:59 np0005539505 ovn_controller[95143]: 2025-11-29T07:37:59Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:9f:d4 10.100.0.11
Nov 29 02:38:01 np0005539505 nova_compute[186958]: 2025-11-29 07:38:01.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:01 np0005539505 nova_compute[186958]: 2025-11-29 07:38:01.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:04 np0005539505 nova_compute[186958]: 2025-11-29 07:38:04.121 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:04 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:04.437 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:38:06 np0005539505 nova_compute[186958]: 2025-11-29 07:38:06.960 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:09 np0005539505 nova_compute[186958]: 2025-11-29 07:38:09.124 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:09 np0005539505 podman[246258]: 2025-11-29 07:38:09.713532064 +0000 UTC m=+0.045891754 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:38:09 np0005539505 podman[246257]: 2025-11-29 07:38:09.756129714 +0000 UTC m=+0.087719183 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal)
Nov 29 02:38:11 np0005539505 nova_compute[186958]: 2025-11-29 07:38:11.964 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:13 np0005539505 podman[246303]: 2025-11-29 07:38:13.707438159 +0000 UTC m=+0.046082839 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:38:14 np0005539505 nova_compute[186958]: 2025-11-29 07:38:14.125 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:16 np0005539505 nova_compute[186958]: 2025-11-29 07:38:16.966 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:19 np0005539505 nova_compute[186958]: 2025-11-29 07:38:19.127 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:22 np0005539505 nova_compute[186958]: 2025-11-29 07:38:22.018 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:22 np0005539505 podman[246322]: 2025-11-29 07:38:22.719987048 +0000 UTC m=+0.057813820 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:38:22 np0005539505 podman[246323]: 2025-11-29 07:38:22.777976952 +0000 UTC m=+0.101549593 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:38:23 np0005539505 nova_compute[186958]: 2025-11-29 07:38:23.388 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:24 np0005539505 nova_compute[186958]: 2025-11-29 07:38:24.129 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:25 np0005539505 nova_compute[186958]: 2025-11-29 07:38:25.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:25 np0005539505 nova_compute[186958]: 2025-11-29 07:38:25.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:25 np0005539505 podman[246373]: 2025-11-29 07:38:25.737264486 +0000 UTC m=+0.061244467 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:38:25 np0005539505 podman[246374]: 2025-11-29 07:38:25.745965161 +0000 UTC m=+0.063262314 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:38:27 np0005539505 nova_compute[186958]: 2025-11-29 07:38:27.068 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:27.519 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:38:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:27.519 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:38:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:27.520 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:38:29 np0005539505 nova_compute[186958]: 2025-11-29 07:38:29.131 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:29 np0005539505 nova_compute[186958]: 2025-11-29 07:38:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:29 np0005539505 nova_compute[186958]: 2025-11-29 07:38:29.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 02:38:31 np0005539505 nova_compute[186958]: 2025-11-29 07:38:31.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:32 np0005539505 nova_compute[186958]: 2025-11-29 07:38:32.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:33 np0005539505 nova_compute[186958]: 2025-11-29 07:38:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:33 np0005539505 nova_compute[186958]: 2025-11-29 07:38:33.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 02:38:33 np0005539505 nova_compute[186958]: 2025-11-29 07:38:33.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 02:38:34 np0005539505 nova_compute[186958]: 2025-11-29 07:38:34.053 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 02:38:34 np0005539505 nova_compute[186958]: 2025-11-29 07:38:34.053 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 02:38:34 np0005539505 nova_compute[186958]: 2025-11-29 07:38:34.053 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 29 02:38:34 np0005539505 nova_compute[186958]: 2025-11-29 07:38:34.054 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2a9bd960-7f4e-411b-b743-70064e15a0d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 29 02:38:34 np0005539505 nova_compute[186958]: 2025-11-29 07:38:34.133 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.075 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.180 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.542 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.542 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.543 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.543 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.653 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.653 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.653 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:38:37 np0005539505 nova_compute[186958]: 2025-11-29 07:38:37.654 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.238 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.292 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.293 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.352 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.521 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.523 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5555MB free_disk=73.04486846923828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.523 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:38:38 np0005539505 nova_compute[186958]: 2025-11-29 07:38:38.524 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:38:39 np0005539505 nova_compute[186958]: 2025-11-29 07:38:39.134 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:39 np0005539505 nova_compute[186958]: 2025-11-29 07:38:39.307 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 2a9bd960-7f4e-411b-b743-70064e15a0d7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:38:39 np0005539505 nova_compute[186958]: 2025-11-29 07:38:39.308 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:38:39 np0005539505 nova_compute[186958]: 2025-11-29 07:38:39.308 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:38:39 np0005539505 nova_compute[186958]: 2025-11-29 07:38:39.418 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:38:40 np0005539505 nova_compute[186958]: 2025-11-29 07:38:40.262 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:38:40 np0005539505 nova_compute[186958]: 2025-11-29 07:38:40.700 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:38:40 np0005539505 nova_compute[186958]: 2025-11-29 07:38:40.700 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:38:40 np0005539505 podman[246420]: 2025-11-29 07:38:40.712800113 +0000 UTC m=+0.051044589 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:38:40 np0005539505 podman[246421]: 2025-11-29 07:38:40.714353027 +0000 UTC m=+0.048881158 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:38:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:41.768 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:38:41 np0005539505 nova_compute[186958]: 2025-11-29 07:38:41.769 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:41.771 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:38:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:38:41.773 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:38:42 np0005539505 nova_compute[186958]: 2025-11-29 07:38:42.087 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:44 np0005539505 nova_compute[186958]: 2025-11-29 07:38:44.136 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:44 np0005539505 podman[246465]: 2025-11-29 07:38:44.716155077 +0000 UTC m=+0.045479072 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:38:46 np0005539505 nova_compute[186958]: 2025-11-29 07:38:46.537 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:46 np0005539505 nova_compute[186958]: 2025-11-29 07:38:46.538 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:38:47 np0005539505 nova_compute[186958]: 2025-11-29 07:38:47.143 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'name': 'tempest-TestGettingAddress-server-1620649343', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000098', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0111c22b4b954ea586ca20d91ed3970f', 'user_id': '31ac7b05b012433b89143dc9f259644a', 'hostId': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.108 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2a9bd960-7f4e-411b-b743-70064e15a0d7 / tapf838accb-ce inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.108 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 2a9bd960-7f4e-411b-b743-70064e15a0d7 / tap7b28d182-bf inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.108 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.109 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bca20570-f2d8-4204-91bb-72b977e831c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.105658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '712a1a44-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'f6adfd3d3c859e6ac92f6313d561f5ef0a1b7ea82f1855db6201d646d13faa77'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.105658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '712a2b2e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '1365eee178941608165c3f3ca3ce455e44ad4557cc6b38bab240b56be5885dad'}]}, 'timestamp': '2025-11-29 07:38:48.109682', '_unique_id': '33cdcdf6390b402388468bf427073eff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.111 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.112 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.122 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.122 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b272a5b-8b83-4eb1-a9ab-80329522a759', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.112328', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712c274e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': 'e45af7e218ec68f05d83275c715e246f0e3bbcae32cabb9eabed1f340b442399'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.112328', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712c3504-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': 'c524185b7e5b28d8b4f1ea39b0039e19f0dc398e9bfbd75017be10bf6c1e35b0'}]}, 'timestamp': '2025-11-29 07:38:48.123048', '_unique_id': '0847c605e72a4a358c1fb48bf2fbd856'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.125 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.125 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>]
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.126 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.126 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82afd899-23df-48d9-9177-aa4566e495f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.126040', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '712cb6dc-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': '254c9e87384f7f6035ecd73b72885b89d21fbb4b05b857752f1e837a98a987cf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.126040', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '712cc3ca-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': '5248b2df010a18d61fd9c90b3548a7b69e55a95a56fb8acac05d27e34be350ae'}]}, 'timestamp': '2025-11-29 07:38:48.126676', '_unique_id': '393a0ff80afa40f6b77d98e37062ba9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.128 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.128 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.bytes volume: 2708 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d6940a7-9746-4779-b288-4206983350e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3390, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.128427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '712d1334-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '4dd2db0951f8835bbf080d49d30b9f1bc523604138585ec25f499549a7b57876'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2708, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.128427', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '712d1d5c-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'b4acfa32f9bdc3d001bec9bb4454cd4c3c20eff2246d4ffe22614361fb1072e1'}]}, 'timestamp': '2025-11-29 07:38:48.128960', '_unique_id': '137027556f814c579d87cb785d02ecbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.130 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.130 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>]
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.130 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.bytes volume: 740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9467f429-52d1-49a0-adb3-44cf2eaa2a90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.130930', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '712d74a0-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'c6704ea84d2a3d4e0f3c46ba935c2a6010374a8c51b6b7a8d223c8ba41b13fd4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 740, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.130930', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '712d7f54-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '55c2d0e2b353023d88db18f6be9c623254acb7e0d70140c3c67cc9c0cda20659'}]}, 'timestamp': '2025-11-29 07:38:48.131469', '_unique_id': 'ef19a31b59084dd8a15aac96bf5b04e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.131 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.133 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.133 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '963b620a-d9c7-40ae-8420-3665ababa2eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.132995', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '712dc55e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '507038496d3b37909521033e48c6377dc47a11ee7ea4b12fcd7a39fd96a706c6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.132995', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '712dd224-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '9da8550042cac5afc3453cf16c49b77d9c4e75400b5b0066d4097e36465befaf'}]}, 'timestamp': '2025-11-29 07:38:48.133624', '_unique_id': '317de8c48efc4201ba948f4f7df3f52c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.134 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.135 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.135 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>]
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.158 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.latency volume: 1982218085 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.159 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7b79cf2-51f5-4bde-a573-0f109c29bb04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1982218085, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.135635', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7131bb3c-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'f2b7708da525cf39f3f3baf3d4a18038b28563fcc2678bd43b607473b4f97698'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.135635', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7131c9e2-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '1cdc837e51a859753b3cc11191c8ce55059ecfb87d4afacfff30b2d8a7ad655e'}]}, 'timestamp': '2025-11-29 07:38:48.159600', '_unique_id': 'a3ba5f5ae30443b69aae6dc2b37da7b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.161 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc772679-d096-4cb2-ae81-03a785e7097b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.161709', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '71322752-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'ff20469ff8e686ec98dc6a0e9c312cb6b063b9ca5ddb80df681ac58ff232c959'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.161709', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '71323486-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '54721a05a79dc35ac768404688175d3180fcd838d9a02a30cf22194dba6a858a'}]}, 'timestamp': '2025-11-29 07:38:48.162341', '_unique_id': '4617faa11f5441ce8c432b8a08602b21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.162 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.163 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.164 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64e25d06-2bc8-4358-8ebd-dd9ab5be88cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.163836', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '71327bbc-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '54a3f96a6a5f709c29047dde002653dd6508288cd93551678738b99ba7519f30'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.163836', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '713288a0-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'c6fb80b1684946ee1f5d08ad986bc393154958fc291583f5d09f98e1544e0279'}]}, 'timestamp': '2025-11-29 07:38:48.164477', '_unique_id': 'e80ba74ebfa04afdbad55537b5566587'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.166 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.bytes volume: 30325248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.166 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50626f62-4ed1-4316-bbb2-07224db19c62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30325248, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.166084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7132d36e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'f8ce8be26f1fe4de290cdec0ce792ebad059a9519029ad5eca030957dfecc822'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 
'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.166084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7132dd78-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'b08d747041b06491a99940d1e18836cdfddd4b21adb031ee44bdaa3a556eed2b'}]}, 'timestamp': '2025-11-29 07:38:48.166667', '_unique_id': 'e71a2e229f5e467da1df60c03347d84a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.167 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.168 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.168 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a6b1792-0908-424c-9da8-91d3c2108488', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.168350', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71332a9e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': '1236ce50c3c8f1e97059e8da489bb81fb1ea34277ae9339fddb2923befa94c82'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 
'2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.168350', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713334e4-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.753174609, 'message_signature': '3c91589fa1733cc855943c8f42cf6cb4a05a4353a09d7a4f34543e6b84236822'}]}, 'timestamp': '2025-11-29 07:38:48.168877', '_unique_id': '51a6120cbd5145108002f4f5ac477a03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.170 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.170 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets volume: 22 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c63f7635-898c-4ad3-acb1-037c04f3d62c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.170416', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '71337bf2-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'bb2f2c6430c169e4f455f2ed20af4bc116b66c936242300b07c158c06357f545'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 22, 
'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.170416', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '71338732-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '09ec7901179a4d66b9189908cf4acda276f147476eeeb32a2cb2b8e16b4cc2d0'}]}, 'timestamp': '2025-11-29 07:38:48.170996', '_unique_id': 'aeed7416296f4a55be1a87aa1b954acb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.171 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.172 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.173 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25133e9a-5e13-4221-a67b-2548fa29e196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.172789', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '7133daf2-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '0504529a39a6ef4a728e06a8292a0de9264e9f41a97622206f62def67e6c9e55'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.172789', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '7133e970-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '231949a17d1f09a819b4931fb176c7b61e8d75e33456cb632eeb360f64f1f47b'}]}, 'timestamp': '2025-11-29 07:38:48.173512', '_unique_id': '370af7f25d2f45de8cbcb9486aef120a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.175 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.175 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '966a7b54-0445-48e2-8d85-eaf9fa5f02d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.175298', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '71343a42-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '21196bce5bc18ee930c6e961ed12fa874275bb4675b15f68fd052f2c3db1c5bb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.175298', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '7134450a-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': 'ad09fb68768ad189b679f3ccb3b21901b0a39b343ba928cac8df2c9d51ce1d28'}]}, 'timestamp': '2025-11-29 07:38:48.175852', '_unique_id': 'ebbb5364844d40cc83ba33dab5426dc3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.177 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaa5f169-638c-483f-83d8-b4f9a6d7f800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.177395', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71348c0e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'be22497ff827a8f3148b23fdbc798cf0c3a48b84469b77bd405e15d9240aa625'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': 
None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.177395', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '713495a0-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '7ff7a62dc76773f851e6031288cff5dfaded8343bc72161014a30c5f8aeee60d'}]}, 'timestamp': '2025-11-29 07:38:48.177933', '_unique_id': '700900c66790409fba52753a69987c88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.179 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.179 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a00274a4-ddc5-4a6b-bbef-8c53076b7fb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tapf838accb-ce', 'timestamp': '2025-11-29T07:38:48.179450', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tapf838accb-ce', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:9f:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf838accb-ce'}, 'message_id': '7134dce0-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '33fdd517352ea4f86e4d1952afaf54aad9a991a8e5693d78efceff6bab997b3a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': 'instance-00000098-2a9bd960-7f4e-411b-b743-70064e15a0d7-tap7b28d182-bf', 'timestamp': '2025-11-29T07:38:48.179450', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'tap7b28d182-bf', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4a:bc:1e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b28d182-bf'}, 'message_id': '7134e712-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.746504971, 'message_signature': '298982c50fcbef5542c5e218b48a1508dac0ddae00b73b408e169709a0080a29'}]}, 'timestamp': '2025-11-29 07:38:48.179999', '_unique_id': 'bb3bfc334e5a4a6982928bf75a616547'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.194 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/cpu volume: 11630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6deeac89-0aaa-4d3f-97b6-f9873a61be17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11630000000, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'timestamp': '2025-11-29T07:38:48.181442', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '71372cc0-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.835149229, 'message_signature': '29a9ed47808abbd24c333781a326846ca813941efa5d3935caa72764369cd0c7'}]}, 'timestamp': '2025-11-29 07:38:48.194953', '_unique_id': '0d33994d05984f928fb13598a8d7cf4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.196 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.requests volume: 325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6669fdc1-62fa-4392-be51-e75e9a737546', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 325, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.196815', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71378260-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '1ce98b3b84f92ae75685ac60ed04ada9ccb7031cd0108e037a529a922f6988bc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.196815', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '71378c56-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'da072a031e329c78597770f8724a5247cda25f197d9ddccd5dc73543884346c9'}]}, 'timestamp': '2025-11-29 07:38:48.197333', '_unique_id': '680704c8faad4bcfaf714e8c181ddb5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.198 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.bytes volume: 73060352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69a25c34-ab15-4a9e-8ca0-63c3da9535a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73060352, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.198793', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7137cf7c-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '35ec4e92e07a5db2011c7dd085cdf58c8dbaea6d4f7522fce44e37797b6ac0c4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.198793', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7137db70-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '65f2e7d7bf7d654a4e66f99e83b9efc2ab75661596abba0177dc210808e886b4'}]}, 'timestamp': '2025-11-29 07:38:48.199380', '_unique_id': '119f735d240349d6be1053e5545e45fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.200 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/memory.usage volume: 43.99609375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d6c3685-66bb-4acd-bd07-62ad87fd1562', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.99609375, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'timestamp': '2025-11-29T07:38:48.200808', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '71381e82-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.835149229, 'message_signature': 'c53233108b30d484d973b10938350204a3f8b811daf486ee60b8a4ce8f4bdb8b'}]}, 'timestamp': '2025-11-29 07:38:48.201076', '_unique_id': '9d4008fe14a64edfbb3ca12191162cef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.202 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.latency volume: 200006424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.202 12 DEBUG ceilometer.compute.pollsters [-] 2a9bd960-7f4e-411b-b743-70064e15a0d7/disk.device.read.latency volume: 25440841 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dfd206c-78e2-4db1-b658-af9a0c9acf29', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 200006424, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-vda', 'timestamp': '2025-11-29T07:38:48.202658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '71386694-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': '2297d0db27d07830fad17b59b1024ca2ac72658d6d99c24a9e7cd251769e033a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25440841, 'user_id': '31ac7b05b012433b89143dc9f259644a', 'user_name': None, 'project_id': '0111c22b4b954ea586ca20d91ed3970f', 'project_name': 
None, 'resource_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7-sda', 'timestamp': '2025-11-29T07:38:48.202658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1620649343', 'name': 'instance-00000098', 'instance_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'instance_type': 'm1.nano', 'host': 'e8769ffa53588665b86a5435855d8a2e32cb3c8bf4ab50262a7c6208', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7138704e-ccf6-11f0-8954-fa163e5a5606', 'monotonic_time': 7375.776483456, 'message_signature': 'ba0ca5e784c440e85d35d5207cea9b3083b79917f561a1a8ef084e6eda6d29a3'}]}, 'timestamp': '2025-11-29 07:38:48.203195', '_unique_id': '5380e515795b49c3814e97675a89555f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:38:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:38:48.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1620649343>]
Nov 29 02:38:49 np0005539505 nova_compute[186958]: 2025-11-29 07:38:49.138 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:52 np0005539505 nova_compute[186958]: 2025-11-29 07:38:52.145 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:53 np0005539505 podman[246489]: 2025-11-29 07:38:53.729001553 +0000 UTC m=+0.061705339 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:38:53 np0005539505 podman[246490]: 2025-11-29 07:38:53.780953397 +0000 UTC m=+0.108382465 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:38:54 np0005539505 nova_compute[186958]: 2025-11-29 07:38:54.187 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:56 np0005539505 podman[246540]: 2025-11-29 07:38:56.724155287 +0000 UTC m=+0.054734273 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:38:56 np0005539505 podman[246539]: 2025-11-29 07:38:56.724534468 +0000 UTC m=+0.059583560 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:38:57 np0005539505 nova_compute[186958]: 2025-11-29 07:38:57.147 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:38:59 np0005539505 nova_compute[186958]: 2025-11-29 07:38:59.190 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:02 np0005539505 nova_compute[186958]: 2025-11-29 07:39:02.149 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:04 np0005539505 nova_compute[186958]: 2025-11-29 07:39:04.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:07 np0005539505 nova_compute[186958]: 2025-11-29 07:39:07.151 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:09 np0005539505 nova_compute[186958]: 2025-11-29 07:39:09.227 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:11 np0005539505 podman[246579]: 2025-11-29 07:39:11.722462329 +0000 UTC m=+0.060138876 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:39:11 np0005539505 podman[246580]: 2025-11-29 07:39:11.7249944 +0000 UTC m=+0.055190806 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:39:12 np0005539505 nova_compute[186958]: 2025-11-29 07:39:12.154 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:14 np0005539505 nova_compute[186958]: 2025-11-29 07:39:14.229 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:15 np0005539505 podman[246624]: 2025-11-29 07:39:15.714429551 +0000 UTC m=+0.047611802 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:39:17 np0005539505 nova_compute[186958]: 2025-11-29 07:39:17.155 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:19 np0005539505 nova_compute[186958]: 2025-11-29 07:39:19.232 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:20.905 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:39:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:20.906 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:39:20 np0005539505 nova_compute[186958]: 2025-11-29 07:39:20.929 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:22 np0005539505 nova_compute[186958]: 2025-11-29 07:39:22.156 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:24 np0005539505 nova_compute[186958]: 2025-11-29 07:39:24.233 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:24 np0005539505 podman[246644]: 2025-11-29 07:39:24.764093546 +0000 UTC m=+0.057782747 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:39:24 np0005539505 podman[246645]: 2025-11-29 07:39:24.817957493 +0000 UTC m=+0.116622254 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:39:27 np0005539505 nova_compute[186958]: 2025-11-29 07:39:27.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:27 np0005539505 nova_compute[186958]: 2025-11-29 07:39:27.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:27 np0005539505 nova_compute[186958]: 2025-11-29 07:39:27.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:27.520 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:27.520 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:27.521 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:27 np0005539505 podman[246694]: 2025-11-29 07:39:27.727018932 +0000 UTC m=+0.056183794 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:39:27 np0005539505 podman[246695]: 2025-11-29 07:39:27.738739092 +0000 UTC m=+0.064442496 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm)
Nov 29 02:39:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:28.908 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:29 np0005539505 nova_compute[186958]: 2025-11-29 07:39:29.235 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:31 np0005539505 nova_compute[186958]: 2025-11-29 07:39:31.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:31 np0005539505 nova_compute[186958]: 2025-11-29 07:39:31.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.177 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.381 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.470 186962 DEBUG nova.compute.manager [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.471 186962 DEBUG nova.compute.manager [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing instance network info cache due to event network-changed-f838accb-ce0a-45ec-bc0d-5aaaae01c4de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.471 186962 DEBUG oslo_concurrency.lockutils [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.471 186962 DEBUG oslo_concurrency.lockutils [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.472 186962 DEBUG nova.network.neutron [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Refreshing network info cache for port f838accb-ce0a-45ec-bc0d-5aaaae01c4de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.792 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.793 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.794 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.794 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.795 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.826 186962 INFO nova.compute.manager [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Terminating instance#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.838 186962 DEBUG nova.compute.manager [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:39:32 np0005539505 kernel: tapf838accb-ce (unregistering): left promiscuous mode
Nov 29 02:39:32 np0005539505 NetworkManager[55134]: <info>  [1764401972.8673] device (tapf838accb-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00738|binding|INFO|Releasing lport f838accb-ce0a-45ec-bc0d-5aaaae01c4de from this chassis (sb_readonly=0)
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00739|binding|INFO|Setting lport f838accb-ce0a-45ec-bc0d-5aaaae01c4de down in Southbound
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00740|binding|INFO|Removing iface tapf838accb-ce ovn-installed in OVS
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 kernel: tap7b28d182-bf (unregistering): left promiscuous mode
Nov 29 02:39:32 np0005539505 NetworkManager[55134]: <info>  [1764401972.8925] device (tap7b28d182-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00741|binding|INFO|Releasing lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 from this chassis (sb_readonly=1)
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00742|binding|INFO|Removing iface tap7b28d182-bf ovn-installed in OVS
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00743|if_status|INFO|Dropped 1 log messages in last 666 seconds (most recently, 666 seconds ago) due to excessive rate
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00744|if_status|INFO|Not setting lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 down as sb is readonly
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 nova_compute[186958]: 2025-11-29 07:39:32.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:32 np0005539505 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000098.scope: Deactivated successfully.
Nov 29 02:39:32 np0005539505 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d00000098.scope: Consumed 16.855s CPU time.
Nov 29 02:39:32 np0005539505 systemd-machined[153285]: Machine qemu-78-instance-00000098 terminated.
Nov 29 02:39:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:39:32Z|00745|binding|INFO|Setting lport 7b28d182-bf32-4711-805d-1bdeb68b89e9 down in Southbound
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:32.999 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:9f:d4 10.100.0.11'], port_security=['fa:16:3e:be:9f:d4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7c7b7bb-9c39-4f1c-a218-71c5fbf31db4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=f838accb-ce0a-45ec-bc0d-5aaaae01c4de) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.000 104094 INFO neutron.agent.ovn.metadata.agent [-] Port f838accb-ce0a-45ec-bc0d-5aaaae01c4de in datapath 51013e93-c048-46cc-9a9d-a184eb63e1b4 unbound from our chassis#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.003 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51013e93-c048-46cc-9a9d-a184eb63e1b4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[34a67993-f700-4ddc-8c1f-c2ec29e34f0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.005 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 namespace which is not needed anymore#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.022 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:bc:1e 2001:db8::f816:3eff:fe4a:bc1e'], port_security=['fa:16:3e:4a:bc:1e 2001:db8::f816:3eff:fe4a:bc1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4a:bc1e/64', 'neutron:device_id': '2a9bd960-7f4e-411b-b743-70064e15a0d7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3085017e-01d1-448e-9eca-033b34f9e960', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b28f67ac-b290-4be5-88df-4393a9d30b89', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3544b146-f2db-4cb1-8570-c2c0dfd4e173, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7b28d182-bf32-4711-805d-1bdeb68b89e9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:39:33 np0005539505 NetworkManager[55134]: <info>  [1764401973.0698] manager: (tap7b28d182-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/363)
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.076 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.085 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.118 186962 INFO nova.virt.libvirt.driver [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Instance destroyed successfully.#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.118 186962 DEBUG nova.objects.instance [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lazy-loading 'resources' on Instance uuid 2a9bd960-7f4e-411b-b743-70064e15a0d7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [NOTICE]   (245995) : haproxy version is 2.8.14-c23fe91
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [NOTICE]   (245995) : path to executable is /usr/sbin/haproxy
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [WARNING]  (245995) : Exiting Master process...
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [ALERT]    (245995) : Current worker (245997) exited with code 143 (Terminated)
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4[245991]: [WARNING]  (245995) : All workers exited. Exiting... (0)
Nov 29 02:39:33 np0005539505 systemd[1]: libpod-562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94.scope: Deactivated successfully.
Nov 29 02:39:33 np0005539505 podman[246785]: 2025-11-29 07:39:33.158074283 +0000 UTC m=+0.046921533 container died 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:39:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94-userdata-shm.mount: Deactivated successfully.
Nov 29 02:39:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-2f004819139b7dbd682e3fdc1fe0beb44aac12f3779ddd603726f1e3ddb5e346-merged.mount: Deactivated successfully.
Nov 29 02:39:33 np0005539505 podman[246785]: 2025-11-29 07:39:33.195793296 +0000 UTC m=+0.084640516 container cleanup 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:39:33 np0005539505 systemd[1]: libpod-conmon-562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94.scope: Deactivated successfully.
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.255 186962 DEBUG nova.virt.libvirt.vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:37:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:37:51Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.255 186962 DEBUG nova.network.os_vif_util [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.256 186962 DEBUG nova.network.os_vif_util [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.256 186962 DEBUG os_vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.258 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.258 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf838accb-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.259 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 podman[246819]: 2025-11-29 07:39:33.260327144 +0000 UTC m=+0.044279619 container remove 562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.262 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.266 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.265 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5b96d59f-7f2b-4cf0-be51-8819456674f1]: (4, ('Sat Nov 29 07:39:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 (562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94)\n562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94\nSat Nov 29 07:39:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 (562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94)\n562160b253ba958b62ad296e2eed792f1d657f20c66b8c7f350cc1eeed2eeb94\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.268 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[efefbc66-ea65-42b3-b7ac-1894c45f085e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.269 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51013e93-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.270 186962 INFO os_vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:9f:d4,bridge_name='br-int',has_traffic_filtering=True,id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de,network=Network(51013e93-c048-46cc-9a9d-a184eb63e1b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf838accb-ce')#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.270 186962 DEBUG nova.virt.libvirt.vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1620649343',display_name='tempest-TestGettingAddress-server-1620649343',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1620649343',id=152,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB3is4E+83iEsFAN3k4vmM3DIB3FGnja3FrFYLTxp4hwLSgaHjf7h1x9RWYq0bVKXNWPDcV4Et8a4J2p23tZrThcwleGMa0jsxDB4wzeuHiJCpifdffKRCxdxwXJAAdszQ==',key_name='tempest-TestGettingAddress-369881010',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:37:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0111c22b4b954ea586ca20d91ed3970f',ramdisk_id='',reservation_id='r-e6m3jp5s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1465017630',owner_user_name='tempest-TestGettingAddress-1465017630-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:37:51Z,user_data=None,user_id='31ac7b05b012433b89143dc9f259644a',uuid=2a9bd960-7f4e-411b-b743-70064e15a0d7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.270 186962 DEBUG nova.network.os_vif_util [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converting VIF {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:39:33 np0005539505 kernel: tap51013e93-c0: left promiscuous mode
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.272 186962 DEBUG nova.network.os_vif_util [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.272 186962 DEBUG os_vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.275 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.276 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b28d182-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.276 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.278 186962 DEBUG nova.compute.manager [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-unplugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.278 186962 DEBUG oslo_concurrency.lockutils [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.278 186962 DEBUG oslo_concurrency.lockutils [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.279 186962 DEBUG oslo_concurrency.lockutils [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.279 186962 DEBUG nova.compute.manager [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-unplugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.279 186962 DEBUG nova.compute.manager [req-7eaf232d-49e7-4f71-81f0-dc067716a5ad req-e6a3aeca-1ff1-4e5a-ac03-eff2189bd9c5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-unplugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.280 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.282 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.284 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.287 186962 INFO os_vif [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:bc:1e,bridge_name='br-int',has_traffic_filtering=True,id=7b28d182-bf32-4711-805d-1bdeb68b89e9,network=Network(3085017e-01d1-448e-9eca-033b34f9e960),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b28d182-bf')#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.287 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7170d07b-939e-4239-9122-156e19eeabda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.287 186962 INFO nova.virt.libvirt.driver [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Deleting instance files /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7_del#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.288 186962 INFO nova.virt.libvirt.driver [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Deletion of /var/lib/nova/instances/2a9bd960-7f4e-411b-b743-70064e15a0d7_del complete#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.301 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[02de8765-5e1c-4203-b561-9c3c89b9bb39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.303 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5db5684b-f4e9-4b07-b4e1-d5ebaf98ebfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.319 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[70ec1f72-e554-4cc4-b69d-b07fbe19f817]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730517, 'reachable_time': 24273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246836, 'error': None, 'target': 'ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 systemd[1]: run-netns-ovnmeta\x2d51013e93\x2dc048\x2d46cc\x2d9a9d\x2da184eb63e1b4.mount: Deactivated successfully.
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.323 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51013e93-c048-46cc-9a9d-a184eb63e1b4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.324 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[950a543c-c4f9-4bed-aaff-3d588cb4b8bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.325 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7b28d182-bf32-4711-805d-1bdeb68b89e9 in datapath 3085017e-01d1-448e-9eca-033b34f9e960 unbound from our chassis#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.326 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3085017e-01d1-448e-9eca-033b34f9e960, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.327 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[80613eb4-b294-42a2-b07e-1d8bac5aa3a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.328 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 namespace which is not needed anymore#033[00m
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [NOTICE]   (246096) : haproxy version is 2.8.14-c23fe91
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [NOTICE]   (246096) : path to executable is /usr/sbin/haproxy
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [WARNING]  (246096) : Exiting Master process...
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [ALERT]    (246096) : Current worker (246115) exited with code 143 (Terminated)
Nov 29 02:39:33 np0005539505 neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960[246065]: [WARNING]  (246096) : All workers exited. Exiting... (0)
Nov 29 02:39:33 np0005539505 systemd[1]: libpod-0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa.scope: Deactivated successfully.
Nov 29 02:39:33 np0005539505 podman[246854]: 2025-11-29 07:39:33.45361356 +0000 UTC m=+0.044518135 container died 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:39:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa-userdata-shm.mount: Deactivated successfully.
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.481 186962 INFO nova.compute.manager [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Took 0.64 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:39:33 np0005539505 systemd[1]: var-lib-containers-storage-overlay-8e339c1dc48483187fa10efd533323f7463969e4bf557d0a348037db376560c4-merged.mount: Deactivated successfully.
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.482 186962 DEBUG oslo.service.loopingcall [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.482 186962 DEBUG nova.compute.manager [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.483 186962 DEBUG nova.network.neutron [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:39:33 np0005539505 podman[246854]: 2025-11-29 07:39:33.494111632 +0000 UTC m=+0.085016207 container cleanup 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:39:33 np0005539505 systemd[1]: libpod-conmon-0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa.scope: Deactivated successfully.
Nov 29 02:39:33 np0005539505 podman[246883]: 2025-11-29 07:39:33.552537398 +0000 UTC m=+0.037744205 container remove 0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.557 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a141bbe9-d285-4942-8a34-ca64d8140357]: (4, ('Sat Nov 29 07:39:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 (0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa)\n0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa\nSat Nov 29 07:39:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 (0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa)\n0f41b1cb1037b0202f2dac736e02c161e7bbaa24eae346f26b8eb15d6fe86caa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.560 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[629bf4ea-c524-48af-b5c5-98e40e639cba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.561 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3085017e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.563 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 kernel: tap3085017e-00: left promiscuous mode
Nov 29 02:39:33 np0005539505 nova_compute[186958]: 2025-11-29 07:39:33.574 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.577 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9741ab9a-6b7f-4f01-b256-bd2aeb0a4ea4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.592 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f92f1ddb-c1cf-438f-8cdc-8d2b9dcf0cd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.594 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e04099e-18d6-4ab3-8fbb-0d869f7473ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.609 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04d44e35-5edf-40ee-95ae-67b5bd3713c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 730601, 'reachable_time': 30781, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246898, 'error': None, 'target': 'ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.611 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3085017e-01d1-448e-9eca-033b34f9e960 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:39:33 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:39:33.612 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b8b5c0-6886-48a4-8777-90fa9d55989c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.140 186962 DEBUG nova.network.neutron [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updated VIF entry in instance network info cache for port f838accb-ce0a-45ec-bc0d-5aaaae01c4de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.140 186962 DEBUG nova.network.neutron [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "address": "fa:16:3e:4a:bc:1e", "network": {"id": "3085017e-01d1-448e-9eca-033b34f9e960", "bridge": "br-int", "label": "tempest-network-smoke--1681462854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe4a:bc1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b28d182-bf", "ovs_interfaceid": "7b28d182-bf32-4711-805d-1bdeb68b89e9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:34 np0005539505 systemd[1]: run-netns-ovnmeta\x2d3085017e\x2d01d1\x2d448e\x2d9eca\x2d033b34f9e960.mount: Deactivated successfully.
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.288 186962 DEBUG oslo_concurrency.lockutils [req-11f9658b-c575-4f9c-9db2-9ee16de8c795 req-87bcc5ac-e1cf-4e07-94b0-7da969932014 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-2a9bd960-7f4e-411b-b743-70064e15a0d7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.630 186962 DEBUG nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-unplugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.630 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.630 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.630 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-unplugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-unplugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.631 186962 DEBUG oslo_concurrency.lockutils [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.632 186962 DEBUG nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:39:34 np0005539505 nova_compute[186958]: 2025-11-29 07:39:34.632 186962 WARNING nova.compute.manager [req-2e8b1b93-f0fc-4a01-982d-a23e4c6a08c4 req-06a1f2b6-f97d-49ae-a715-3f976baea27a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received unexpected event network-vif-plugged-7b28d182-bf32-4711-805d-1bdeb68b89e9 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.436 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.436 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.543 186962 DEBUG nova.compute.manager [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.544 186962 DEBUG oslo_concurrency.lockutils [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.544 186962 DEBUG oslo_concurrency.lockutils [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.545 186962 DEBUG oslo_concurrency.lockutils [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.545 186962 DEBUG nova.compute.manager [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] No waiting events found dispatching network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:39:35 np0005539505 nova_compute[186958]: 2025-11-29 07:39:35.545 186962 WARNING nova.compute.manager [req-fa0f7d05-a151-434d-b5c3-a5a871d2c77b req-30daacfd-c945-4fa7-92f5-cc857f09252e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received unexpected event network-vif-plugged-f838accb-ce0a-45ec-bc0d-5aaaae01c4de for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:39:36 np0005539505 nova_compute[186958]: 2025-11-29 07:39:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:37 np0005539505 nova_compute[186958]: 2025-11-29 07:39:37.179 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.279 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.678 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.678 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.678 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.679 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.871 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.872 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5707MB free_disk=73.0736198425293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.873 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.873 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.900 186962 DEBUG nova.compute.manager [req-484fa6fc-37dc-4829-bb20-f48fd520d592 req-5362a42b-3a6f-48b1-a029-8487d7d02872 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-deleted-7b28d182-bf32-4711-805d-1bdeb68b89e9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.901 186962 INFO nova.compute.manager [req-484fa6fc-37dc-4829-bb20-f48fd520d592 req-5362a42b-3a6f-48b1-a029-8487d7d02872 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Neutron deleted interface 7b28d182-bf32-4711-805d-1bdeb68b89e9; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:39:38 np0005539505 nova_compute[186958]: 2025-11-29 07:39:38.901 186962 DEBUG nova.network.neutron [req-484fa6fc-37dc-4829-bb20-f48fd520d592 req-5362a42b-3a6f-48b1-a029-8487d7d02872 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [{"id": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "address": "fa:16:3e:be:9f:d4", "network": {"id": "51013e93-c048-46cc-9a9d-a184eb63e1b4", "bridge": "br-int", "label": "tempest-network-smoke--890595323", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0111c22b4b954ea586ca20d91ed3970f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf838accb-ce", "ovs_interfaceid": "f838accb-ce0a-45ec-bc0d-5aaaae01c4de", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:39 np0005539505 nova_compute[186958]: 2025-11-29 07:39:39.070 186962 DEBUG nova.compute.manager [req-484fa6fc-37dc-4829-bb20-f48fd520d592 req-5362a42b-3a6f-48b1-a029-8487d7d02872 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Detach interface failed, port_id=7b28d182-bf32-4711-805d-1bdeb68b89e9, reason: Instance 2a9bd960-7f4e-411b-b743-70064e15a0d7 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:39:39 np0005539505 nova_compute[186958]: 2025-11-29 07:39:39.487 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 2a9bd960-7f4e-411b-b743-70064e15a0d7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:39:39 np0005539505 nova_compute[186958]: 2025-11-29 07:39:39.488 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:39:39 np0005539505 nova_compute[186958]: 2025-11-29 07:39:39.488 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:39:39 np0005539505 nova_compute[186958]: 2025-11-29 07:39:39.596 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:39:40 np0005539505 nova_compute[186958]: 2025-11-29 07:39:40.312 186962 DEBUG nova.network.neutron [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:40 np0005539505 nova_compute[186958]: 2025-11-29 07:39:40.768 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:39:42 np0005539505 nova_compute[186958]: 2025-11-29 07:39:42.225 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:42 np0005539505 podman[246900]: 2025-11-29 07:39:42.731095055 +0000 UTC m=+0.061944836 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64)
Nov 29 02:39:42 np0005539505 podman[246901]: 2025-11-29 07:39:42.742360123 +0000 UTC m=+0.069888321 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.225 186962 DEBUG nova.compute.manager [req-93a0f950-8ac1-4b31-aa2b-5212b2ef4a85 req-eda3ff90-3401-4a71-8419-a131d9468fd8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Received event network-vif-deleted-f838accb-ce0a-45ec-bc0d-5aaaae01c4de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.226 186962 INFO nova.compute.manager [req-93a0f950-8ac1-4b31-aa2b-5212b2ef4a85 req-eda3ff90-3401-4a71-8419-a131d9468fd8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Neutron deleted interface f838accb-ce0a-45ec-bc0d-5aaaae01c4de; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.226 186962 DEBUG nova.network.neutron [req-93a0f950-8ac1-4b31-aa2b-5212b2ef4a85 req-eda3ff90-3401-4a71-8419-a131d9468fd8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.240 186962 INFO nova.compute.manager [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Took 9.76 seconds to deallocate network for instance.#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.271 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.272 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.328 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:43 np0005539505 nova_compute[186958]: 2025-11-29 07:39:43.384 186962 DEBUG nova.compute.manager [req-93a0f950-8ac1-4b31-aa2b-5212b2ef4a85 req-eda3ff90-3401-4a71-8419-a131d9468fd8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Detach interface failed, port_id=f838accb-ce0a-45ec-bc0d-5aaaae01c4de, reason: Instance 2a9bd960-7f4e-411b-b743-70064e15a0d7 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:39:44 np0005539505 nova_compute[186958]: 2025-11-29 07:39:44.761 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:39:44 np0005539505 nova_compute[186958]: 2025-11-29 07:39:44.762 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:39:44 np0005539505 nova_compute[186958]: 2025-11-29 07:39:44.816 186962 DEBUG nova.compute.provider_tree [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:39:44 np0005539505 nova_compute[186958]: 2025-11-29 07:39:44.983 186962 DEBUG nova.scheduler.client.report [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:39:45 np0005539505 nova_compute[186958]: 2025-11-29 07:39:45.160 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:45 np0005539505 nova_compute[186958]: 2025-11-29 07:39:45.257 186962 INFO nova.scheduler.client.report [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Deleted allocations for instance 2a9bd960-7f4e-411b-b743-70064e15a0d7#033[00m
Nov 29 02:39:45 np0005539505 nova_compute[186958]: 2025-11-29 07:39:45.698 186962 DEBUG oslo_concurrency.lockutils [None req-24be3e00-78a7-4860-a433-38a2b29dc212 31ac7b05b012433b89143dc9f259644a 0111c22b4b954ea586ca20d91ed3970f - - default default] Lock "2a9bd960-7f4e-411b-b743-70064e15a0d7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:39:46 np0005539505 podman[246946]: 2025-11-29 07:39:46.710582525 +0000 UTC m=+0.048425855 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:39:47 np0005539505 nova_compute[186958]: 2025-11-29 07:39:47.262 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.117 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764401973.1156332, 2a9bd960-7f4e-411b-b743-70064e15a0d7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.118 186962 INFO nova.compute.manager [-] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.267 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.268 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.278 186962 DEBUG nova.compute.manager [None req-b383651a-c177-42c5-9502-40e2334005b5 - - - - - -] [instance: 2a9bd960-7f4e-411b-b743-70064e15a0d7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:39:48 np0005539505 nova_compute[186958]: 2025-11-29 07:39:48.386 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:52 np0005539505 nova_compute[186958]: 2025-11-29 07:39:52.263 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:53 np0005539505 nova_compute[186958]: 2025-11-29 07:39:53.389 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:54 np0005539505 nova_compute[186958]: 2025-11-29 07:39:54.503 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:54 np0005539505 nova_compute[186958]: 2025-11-29 07:39:54.766 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:55 np0005539505 podman[246967]: 2025-11-29 07:39:55.736308596 +0000 UTC m=+0.072188715 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:39:55 np0005539505 podman[246968]: 2025-11-29 07:39:55.748016496 +0000 UTC m=+0.071598759 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:39:57 np0005539505 nova_compute[186958]: 2025-11-29 07:39:57.296 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:58 np0005539505 nova_compute[186958]: 2025-11-29 07:39:58.392 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:39:58 np0005539505 podman[247017]: 2025-11-29 07:39:58.74033019 +0000 UTC m=+0.068691756 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 29 02:39:58 np0005539505 podman[247018]: 2025-11-29 07:39:58.74031285 +0000 UTC m=+0.067254446 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:40:02 np0005539505 nova_compute[186958]: 2025-11-29 07:40:02.298 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:03 np0005539505 nova_compute[186958]: 2025-11-29 07:40:03.396 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:07 np0005539505 nova_compute[186958]: 2025-11-29 07:40:07.301 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:08 np0005539505 nova_compute[186958]: 2025-11-29 07:40:08.399 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:12 np0005539505 nova_compute[186958]: 2025-11-29 07:40:12.302 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:13 np0005539505 nova_compute[186958]: 2025-11-29 07:40:13.403 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:13 np0005539505 podman[247058]: 2025-11-29 07:40:13.730683936 +0000 UTC m=+0.055636528 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 29 02:40:13 np0005539505 podman[247059]: 2025-11-29 07:40:13.762793541 +0000 UTC m=+0.068185062 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:40:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:14.785 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:14.786 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:40:14 np0005539505 nova_compute[186958]: 2025-11-29 07:40:14.787 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:16.788 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:40:17 np0005539505 nova_compute[186958]: 2025-11-29 07:40:17.359 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:17 np0005539505 podman[247101]: 2025-11-29 07:40:17.715052715 +0000 UTC m=+0.045011389 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:40:18 np0005539505 nova_compute[186958]: 2025-11-29 07:40:18.450 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:22 np0005539505 nova_compute[186958]: 2025-11-29 07:40:22.361 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:23 np0005539505 nova_compute[186958]: 2025-11-29 07:40:23.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:23 np0005539505 nova_compute[186958]: 2025-11-29 07:40:23.452 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:23.781 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:23.782 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated#033[00m
Nov 29 02:40:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:23.784 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:40:23 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:23.786 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[78a0969d-d3e5-4460-a023-214222af5043]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:26 np0005539505 podman[247121]: 2025-11-29 07:40:26.763759917 +0000 UTC m=+0.073374145 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:40:26 np0005539505 podman[247122]: 2025-11-29 07:40:26.787751875 +0000 UTC m=+0.099765291 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:40:27 np0005539505 nova_compute[186958]: 2025-11-29 07:40:27.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:27.521 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:27.522 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:27.522 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:28 np0005539505 nova_compute[186958]: 2025-11-29 07:40:28.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:28 np0005539505 nova_compute[186958]: 2025-11-29 07:40:28.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:29 np0005539505 nova_compute[186958]: 2025-11-29 07:40:29.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:29 np0005539505 podman[247169]: 2025-11-29 07:40:29.748210141 +0000 UTC m=+0.069495263 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:40:29 np0005539505 podman[247168]: 2025-11-29 07:40:29.774155145 +0000 UTC m=+0.087409377 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:40:31 np0005539505 nova_compute[186958]: 2025-11-29 07:40:31.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:31 np0005539505 nova_compute[186958]: 2025-11-29 07:40:31.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:40:32 np0005539505 nova_compute[186958]: 2025-11-29 07:40:32.365 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:32 np0005539505 nova_compute[186958]: 2025-11-29 07:40:32.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:33 np0005539505 nova_compute[186958]: 2025-11-29 07:40:33.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:35 np0005539505 nova_compute[186958]: 2025-11-29 07:40:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:35 np0005539505 nova_compute[186958]: 2025-11-29 07:40:35.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:40:35 np0005539505 nova_compute[186958]: 2025-11-29 07:40:35.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:40:35 np0005539505 nova_compute[186958]: 2025-11-29 07:40:35.406 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:40:36 np0005539505 nova_compute[186958]: 2025-11-29 07:40:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:37.177 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8:0:1:f816:3eff:feac:f62b 2001:db8::f816:3eff:feac:f62b'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:feac:f62b/64 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=de1096f6-2a15-4f04-9ea7-22d2dff24e74, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2459b7bb-f6d0-4520-a009-14c9d4a2b794) old=Port_Binding(mac=['fa:16:3e:ac:f6:2b 10.100.0.2 2001:db8::f816:3eff:feac:f62b'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feac:f62b/64', 'neutron:device_id': 'ovnmeta-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-600edac6-24aa-414f-b977-07c2890470f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:37.180 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2459b7bb-f6d0-4520-a009-14c9d4a2b794 in datapath 600edac6-24aa-414f-b977-07c2890470f1 updated#033[00m
Nov 29 02:40:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:37.183 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 600edac6-24aa-414f-b977-07c2890470f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:40:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:37.184 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5b5db967-4e5d-4fbb-a415-5748350e5993]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:40:37 np0005539505 nova_compute[186958]: 2025-11-29 07:40:37.371 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:38 np0005539505 nova_compute[186958]: 2025-11-29 07:40:38.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.414 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.415 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.415 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.592 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.594 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5733MB free_disk=73.0736198425293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.594 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.594 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.782 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.782 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.807 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.882 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.882 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.903 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.958 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:40:40 np0005539505 nova_compute[186958]: 2025-11-29 07:40:40.993 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:40:41 np0005539505 nova_compute[186958]: 2025-11-29 07:40:41.037 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:40:41 np0005539505 nova_compute[186958]: 2025-11-29 07:40:41.120 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:40:41 np0005539505 nova_compute[186958]: 2025-11-29 07:40:41.120 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:40:42 np0005539505 nova_compute[186958]: 2025-11-29 07:40:42.418 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:43 np0005539505 nova_compute[186958]: 2025-11-29 07:40:43.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:44 np0005539505 nova_compute[186958]: 2025-11-29 07:40:44.113 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:44 np0005539505 podman[247207]: 2025-11-29 07:40:44.720105194 +0000 UTC m=+0.054664538 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:40:44 np0005539505 podman[247208]: 2025-11-29 07:40:44.736767562 +0000 UTC m=+0.068039782 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:40:45 np0005539505 nova_compute[186958]: 2025-11-29 07:40:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:40:47 np0005539505 nova_compute[186958]: 2025-11-29 07:40:47.419 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:40:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:40:48 np0005539505 nova_compute[186958]: 2025-11-29 07:40:48.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:48 np0005539505 podman[247253]: 2025-11-29 07:40:48.719055436 +0000 UTC m=+0.051019944 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:40:52 np0005539505 nova_compute[186958]: 2025-11-29 07:40:52.420 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:52.644 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:40:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:52.645 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:40:52 np0005539505 nova_compute[186958]: 2025-11-29 07:40:52.689 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:53 np0005539505 nova_compute[186958]: 2025-11-29 07:40:53.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:57 np0005539505 nova_compute[186958]: 2025-11-29 07:40:57.421 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:57 np0005539505 podman[247272]: 2025-11-29 07:40:57.753121374 +0000 UTC m=+0.069164074 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:40:57 np0005539505 podman[247273]: 2025-11-29 07:40:57.773309823 +0000 UTC m=+0.091938527 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:40:58 np0005539505 nova_compute[186958]: 2025-11-29 07:40:58.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:40:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:40:59.648 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:00 np0005539505 podman[247322]: 2025-11-29 07:41:00.737188306 +0000 UTC m=+0.061926156 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:41:00 np0005539505 podman[247323]: 2025-11-29 07:41:00.79442745 +0000 UTC m=+0.114144126 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:41:02 np0005539505 nova_compute[186958]: 2025-11-29 07:41:02.422 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:03 np0005539505 nova_compute[186958]: 2025-11-29 07:41:03.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:07 np0005539505 nova_compute[186958]: 2025-11-29 07:41:07.424 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:08 np0005539505 nova_compute[186958]: 2025-11-29 07:41:08.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:09Z|00746|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.603 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.603 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.647 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.788 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.788 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.795 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:41:10 np0005539505 nova_compute[186958]: 2025-11-29 07:41:10.795 186962 INFO nova.compute.claims [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.382 186962 DEBUG nova.compute.provider_tree [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.403 186962 DEBUG nova.scheduler.client.report [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.443 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.654s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.444 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.533 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.534 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.832 186962 INFO nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:41:11 np0005539505 nova_compute[186958]: 2025-11-29 07:41:11.888 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.025 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.026 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.026 186962 INFO nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Creating image(s)#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.027 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.027 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.028 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.040 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.083 186962 DEBUG nova.policy [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.130 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.131 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.132 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.143 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.204 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.205 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.242 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.243 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.243 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.297 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.300 186962 DEBUG nova.virt.disk.api [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.301 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.366 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.370 186962 DEBUG nova.virt.disk.api [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.370 186962 DEBUG nova.objects.instance [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.416 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.417 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Ensure instance console log exists: /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.417 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.418 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.418 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:12 np0005539505 nova_compute[186958]: 2025-11-29 07:41:12.426 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:13 np0005539505 nova_compute[186958]: 2025-11-29 07:41:13.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:15 np0005539505 nova_compute[186958]: 2025-11-29 07:41:15.379 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Successfully created port: 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:41:15 np0005539505 podman[247377]: 2025-11-29 07:41:15.735040846 +0000 UTC m=+0.059740374 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:41:15 np0005539505 podman[247376]: 2025-11-29 07:41:15.770357269 +0000 UTC m=+0.094389298 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal)
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.005 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Successfully updated port: 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.429 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.620 186962 DEBUG nova.compute.manager [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.620 186962 DEBUG nova.compute.manager [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing instance network info cache due to event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.621 186962 DEBUG oslo_concurrency.lockutils [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.621 186962 DEBUG oslo_concurrency.lockutils [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.621 186962 DEBUG nova.network.neutron [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing network info cache for port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:41:17 np0005539505 nova_compute[186958]: 2025-11-29 07:41:17.624 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:18 np0005539505 nova_compute[186958]: 2025-11-29 07:41:18.620 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:18 np0005539505 nova_compute[186958]: 2025-11-29 07:41:18.808 186962 DEBUG nova.network.neutron [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:41:19 np0005539505 podman[247418]: 2025-11-29 07:41:19.725027189 +0000 UTC m=+0.053922237 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:41:20 np0005539505 nova_compute[186958]: 2025-11-29 07:41:20.450 186962 DEBUG nova.network.neutron [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:20 np0005539505 nova_compute[186958]: 2025-11-29 07:41:20.485 186962 DEBUG oslo_concurrency.lockutils [req-7aef9ff0-6aca-4f91-95ef-0035ea086db3 req-1c6182ff-3543-486a-8b64-3c9676188184 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:20 np0005539505 nova_compute[186958]: 2025-11-29 07:41:20.486 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:20 np0005539505 nova_compute[186958]: 2025-11-29 07:41:20.486 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:41:20 np0005539505 nova_compute[186958]: 2025-11-29 07:41:20.824 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:41:22 np0005539505 nova_compute[186958]: 2025-11-29 07:41:22.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:23 np0005539505 nova_compute[186958]: 2025-11-29 07:41:23.622 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.794 186962 DEBUG nova.network.neutron [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.827 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.827 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance network_info: |[{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.830 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Start _get_guest_xml network_info=[{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.834 186962 WARNING nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.843 186962 DEBUG nova.virt.libvirt.host [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.844 186962 DEBUG nova.virt.libvirt.host [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.850 186962 DEBUG nova.virt.libvirt.host [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.851 186962 DEBUG nova.virt.libvirt.host [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.854 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.856 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.857 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.857 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.858 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.858 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.858 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.859 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.860 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.860 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.860 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.860 186962 DEBUG nova.virt.hardware [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.864 186962 DEBUG nova.virt.libvirt.vif [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:41:11Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.864 186962 DEBUG nova.network.os_vif_util [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.865 186962 DEBUG nova.network.os_vif_util [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.866 186962 DEBUG nova.objects.instance [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.883 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <uuid>1054c168-50b7-42e4-aedb-6ddca8a197a4</uuid>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <name>instance-0000009f</name>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:41:24</nova:creationTime>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="serial">1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="uuid">1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:0d:fa:32"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <target dev="tap7a99c6a7-ba"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log" append="off"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:41:24 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:41:24 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:41:24 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:41:24 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.883 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Preparing to wait for external event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.883 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.883 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.883 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.884 186962 DEBUG nova.virt.libvirt.vif [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:41:11Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.884 186962 DEBUG nova.network.os_vif_util [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.885 186962 DEBUG nova.network.os_vif_util [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.885 186962 DEBUG os_vif [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.886 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.886 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.889 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a99c6a7-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.889 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a99c6a7-ba, col_values=(('external_ids', {'iface-id': '7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:fa:32', 'vm-uuid': '1054c168-50b7-42e4-aedb-6ddca8a197a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:24 np0005539505 NetworkManager[55134]: <info>  [1764402084.8918] manager: (tap7a99c6a7-ba): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/364)
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.898 186962 INFO os_vif [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba')#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.963 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.964 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.965 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:0d:fa:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:41:24 np0005539505 nova_compute[186958]: 2025-11-29 07:41:24.966 186962 INFO nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Using config drive#033[00m
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.462 186962 INFO nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Creating config drive at /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config#033[00m
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.473 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgf5b6e8v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.610 186962 DEBUG oslo_concurrency.processutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgf5b6e8v" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:26 np0005539505 kernel: tap7a99c6a7-ba: entered promiscuous mode
Nov 29 02:41:26 np0005539505 NetworkManager[55134]: <info>  [1764402086.6765] manager: (tap7a99c6a7-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/365)
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:26Z|00747|binding|INFO|Claiming lport 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d for this chassis.
Nov 29 02:41:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:26Z|00748|binding|INFO|7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d: Claiming fa:16:3e:0d:fa:32 10.100.0.12
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.679 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.684 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539505 systemd-udevd[247458]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:41:26 np0005539505 NetworkManager[55134]: <info>  [1764402086.7148] device (tap7a99c6a7-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:41:26 np0005539505 NetworkManager[55134]: <info>  [1764402086.7159] device (tap7a99c6a7-ba): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:41:26 np0005539505 systemd-machined[153285]: New machine qemu-79-instance-0000009f.
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:26Z|00749|binding|INFO|Setting lport 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d ovn-installed in OVS
Nov 29 02:41:26 np0005539505 nova_compute[186958]: 2025-11-29 07:41:26.738 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:26 np0005539505 systemd[1]: Started Virtual Machine qemu-79-instance-0000009f.
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.809 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:fa:32 10.100.0.12'], port_security=['fa:16:3e:0d:fa:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-699261cc-4df3-4556-934e-8ab5d6d3f144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4dc7eec-7c17-458c-a1f4-6b636722c8d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd9dc223-0c14-4e63-a38e-bc90457ac64b, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.811 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d in datapath 699261cc-4df3-4556-934e-8ab5d6d3f144 bound to our chassis#033[00m
Nov 29 02:41:26 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:26Z|00750|binding|INFO|Setting lport 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d up in Southbound
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.812 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 699261cc-4df3-4556-934e-8ab5d6d3f144#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.823 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c6f66e-84cc-4c6a-99ab-8996b8f98eef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.823 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap699261cc-41 in ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.825 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap699261cc-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.825 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a75b24a7-678f-47df-88a8-9ade507b00de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.826 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04be4417-b61a-4184-8178-37294e0ae614]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.838 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1904b7-b373-4c92-b212-6d9fd5077f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.861 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[542ada3d-1fea-464e-931c-fb3ad0b51909]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.897 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d64be9-df5b-432b-a5ac-d5b698c8a9ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:26 np0005539505 NetworkManager[55134]: <info>  [1764402086.9034] manager: (tap699261cc-40): new Veth device (/org/freedesktop/NetworkManager/Devices/366)
Nov 29 02:41:26 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.904 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a6d114-6e18-4a91-b993-402a2963421c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:26.944 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4d3a6e-8735-4c66-9f86-b8be52c4109b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.465 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.465 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ba70e437-c9a5-4959-964a-98c9abf2f598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 NetworkManager[55134]: <info>  [1764402087.4939] device (tap699261cc-40): carrier: link connected
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.496 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402087.4959958, 1054c168-50b7-42e4-aedb-6ddca8a197a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.497 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] VM Started (Lifecycle Event)#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.500 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b92fdada-5e6c-48f0-8e57-3de0a69fe478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.517 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ce980f-2615-4126-93be-3a41814d863b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap699261cc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:62:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 753507, 'reachable_time': 31992, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247499, 'error': None, 'target': 'ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.523 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.523 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.524 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.531 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa54471-f133-424c-9260-d85d9fa75d83]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:629a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 753507, 'tstamp': 753507}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247500, 'error': None, 'target': 'ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.545 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ae2614-134d-4b87-8fbb-75e6396f696e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap699261cc-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:62:9a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 753507, 'reachable_time': 31992, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247501, 'error': None, 'target': 'ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.582 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2493a515-1df2-431d-a0b0-6dc194845427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.614 186962 DEBUG nova.compute.manager [req-edbfe385-ca3f-4498-9933-c3b210ea1301 req-1689e52b-b518-4984-a717-fa5ddea84625 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.615 186962 DEBUG oslo_concurrency.lockutils [req-edbfe385-ca3f-4498-9933-c3b210ea1301 req-1689e52b-b518-4984-a717-fa5ddea84625 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.616 186962 DEBUG oslo_concurrency.lockutils [req-edbfe385-ca3f-4498-9933-c3b210ea1301 req-1689e52b-b518-4984-a717-fa5ddea84625 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.616 186962 DEBUG oslo_concurrency.lockutils [req-edbfe385-ca3f-4498-9933-c3b210ea1301 req-1689e52b-b518-4984-a717-fa5ddea84625 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.616 186962 DEBUG nova.compute.manager [req-edbfe385-ca3f-4498-9933-c3b210ea1301 req-1689e52b-b518-4984-a717-fa5ddea84625 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Processing event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.618 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.622 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.628 186962 INFO nova.virt.libvirt.driver [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance spawned successfully.#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.629 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.635 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c77206af-3687-422f-95fb-9d9dee3e9f52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.636 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap699261cc-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.636 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.637 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap699261cc-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:27 np0005539505 NetworkManager[55134]: <info>  [1764402087.6878] manager: (tap699261cc-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Nov 29 02:41:27 np0005539505 kernel: tap699261cc-40: entered promiscuous mode
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.688 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.692 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap699261cc-40, col_values=(('external_ids', {'iface-id': '3e62b79d-3423-4f59-a770-7cbfebfe062a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:27Z|00751|binding|INFO|Releasing lport 3e62b79d-3423-4f59-a770-7cbfebfe062a from this chassis (sb_readonly=0)
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.695 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/699261cc-4df3-4556-934e-8ab5d6d3f144.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/699261cc-4df3-4556-934e-8ab5d6d3f144.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.697 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aeddb91f-81d1-4964-b9d9-baa424ca67bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.698 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-699261cc-4df3-4556-934e-8ab5d6d3f144
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/699261cc-4df3-4556-934e-8ab5d6d3f144.pid.haproxy
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 699261cc-4df3-4556-934e-8ab5d6d3f144
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:41:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:27.698 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144', 'env', 'PROCESS_TAG=haproxy-699261cc-4df3-4556-934e-8ab5d6d3f144', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/699261cc-4df3-4556-934e-8ab5d6d3f144.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.700 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.704 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.704 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.705 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.705 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.706 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.706 186962 DEBUG nova.virt.libvirt.driver [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.753 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.761 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402087.4968965, 1054c168-50b7-42e4-aedb-6ddca8a197a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.761 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.797 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.802 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402087.6215858, 1054c168-50b7-42e4-aedb-6ddca8a197a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.802 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.827 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.830 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.839 186962 INFO nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Took 15.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.840 186962 DEBUG nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:41:27 np0005539505 nova_compute[186958]: 2025-11-29 07:41:27.865 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:41:28 np0005539505 podman[247534]: 2025-11-29 07:41:28.092230407 +0000 UTC m=+0.058544290 container create 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:41:28 np0005539505 systemd[1]: Started libpod-conmon-204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d.scope.
Nov 29 02:41:28 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:41:28 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc5284d99314774fb41c7ff8fedba3829dc8a84836e22d985964efdb503197c4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:41:28 np0005539505 podman[247534]: 2025-11-29 07:41:28.061566488 +0000 UTC m=+0.027880400 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:41:28 np0005539505 podman[247534]: 2025-11-29 07:41:28.161453782 +0000 UTC m=+0.127767674 container init 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 02:41:28 np0005539505 podman[247534]: 2025-11-29 07:41:28.16661469 +0000 UTC m=+0.132928572 container start 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:41:28 np0005539505 podman[247547]: 2025-11-29 07:41:28.180121877 +0000 UTC m=+0.055292356 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:41:28 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [NOTICE]   (247587) : New worker (247598) forked
Nov 29 02:41:28 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [NOTICE]   (247587) : Loading success.
Nov 29 02:41:28 np0005539505 podman[247551]: 2025-11-29 07:41:28.207248505 +0000 UTC m=+0.079922042 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:41:28 np0005539505 nova_compute[186958]: 2025-11-29 07:41:28.488 186962 INFO nova.compute.manager [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Took 17.74 seconds to build instance.#033[00m
Nov 29 02:41:28 np0005539505 nova_compute[186958]: 2025-11-29 07:41:28.525 186962 DEBUG oslo_concurrency.lockutils [None req-e0e01e58-0af4-484f-a112-be580b1963f5 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.858 186962 DEBUG nova.compute.manager [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.859 186962 DEBUG oslo_concurrency.lockutils [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.859 186962 DEBUG oslo_concurrency.lockutils [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.860 186962 DEBUG oslo_concurrency.lockutils [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.860 186962 DEBUG nova.compute.manager [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.860 186962 WARNING nova.compute.manager [req-4121d2ac-cb69-430b-917f-1f8053af3d64 req-3b6e9276-875d-4254-b85c-1b1e44ae9920 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d for instance with vm_state active and task_state None.#033[00m
Nov 29 02:41:29 np0005539505 nova_compute[186958]: 2025-11-29 07:41:29.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:30 np0005539505 nova_compute[186958]: 2025-11-29 07:41:30.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:31 np0005539505 podman[247612]: 2025-11-29 07:41:31.723349622 +0000 UTC m=+0.057788198 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:41:31 np0005539505 podman[247613]: 2025-11-29 07:41:31.724860635 +0000 UTC m=+0.057647864 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:41:32 np0005539505 nova_compute[186958]: 2025-11-29 07:41:32.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:32 np0005539505 nova_compute[186958]: 2025-11-29 07:41:32.469 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:33 np0005539505 nova_compute[186958]: 2025-11-29 07:41:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:33 np0005539505 nova_compute[186958]: 2025-11-29 07:41:33.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:33 np0005539505 nova_compute[186958]: 2025-11-29 07:41:33.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:41:34 np0005539505 nova_compute[186958]: 2025-11-29 07:41:34.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:36 np0005539505 nova_compute[186958]: 2025-11-29 07:41:36.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:36 np0005539505 nova_compute[186958]: 2025-11-29 07:41:36.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:41:36 np0005539505 nova_compute[186958]: 2025-11-29 07:41:36.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:41:37 np0005539505 nova_compute[186958]: 2025-11-29 07:41:37.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:38 np0005539505 nova_compute[186958]: 2025-11-29 07:41:38.655 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:38 np0005539505 nova_compute[186958]: 2025-11-29 07:41:38.655 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:38 np0005539505 nova_compute[186958]: 2025-11-29 07:41:38.656 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:41:38 np0005539505 nova_compute[186958]: 2025-11-29 07:41:38.656 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.079 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:39 np0005539505 NetworkManager[55134]: <info>  [1764402099.0799] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/368)
Nov 29 02:41:39 np0005539505 NetworkManager[55134]: <info>  [1764402099.0808] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.220 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:39 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:39Z|00752|binding|INFO|Releasing lport 3e62b79d-3423-4f59-a770-7cbfebfe062a from this chassis (sb_readonly=0)
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.242 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.914 186962 DEBUG nova.compute.manager [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.914 186962 DEBUG nova.compute.manager [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing instance network info cache due to event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:41:39 np0005539505 nova_compute[186958]: 2025-11-29 07:41:39.914 186962 DEBUG oslo_concurrency.lockutils [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:41:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:40Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:fa:32 10.100.0.12
Nov 29 02:41:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:41:40Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:fa:32 10.100.0.12
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.473 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.623 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.656 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.656 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.657 186962 DEBUG oslo_concurrency.lockutils [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.657 186962 DEBUG nova.network.neutron [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing network info cache for port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.659 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.660 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.700 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.701 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.701 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.701 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.783 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.862 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.865 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:41:42 np0005539505 nova_compute[186958]: 2025-11-29 07:41:42.926 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.104 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.105 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5541MB free_disk=73.04440689086914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.106 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.106 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.263 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.264 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.264 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.339 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.446 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.482 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:41:43 np0005539505 nova_compute[186958]: 2025-11-29 07:41:43.482 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:41:44 np0005539505 nova_compute[186958]: 2025-11-29 07:41:44.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:46 np0005539505 nova_compute[186958]: 2025-11-29 07:41:46.202 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:46 np0005539505 nova_compute[186958]: 2025-11-29 07:41:46.202 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:41:46 np0005539505 podman[247673]: 2025-11-29 07:41:46.719186603 +0000 UTC m=+0.050007504 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:41:46 np0005539505 podman[247672]: 2025-11-29 07:41:46.721692385 +0000 UTC m=+0.055684127 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_id=edpm, version=9.6, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Nov 29 02:41:46 np0005539505 nova_compute[186958]: 2025-11-29 07:41:46.767 186962 DEBUG nova.network.neutron [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated VIF entry in instance network info cache for port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:41:46 np0005539505 nova_compute[186958]: 2025-11-29 07:41:46.768 186962 DEBUG nova.network.neutron [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:41:47 np0005539505 nova_compute[186958]: 2025-11-29 07:41:47.077 186962 DEBUG oslo_concurrency.lockutils [req-f040fa52-3b72-43ea-9f39-8b0dba8e5a21 req-5e8fa665-ed47-43ed-a9bd-446c1dd63198 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:41:47 np0005539505 nova_compute[186958]: 2025-11-29 07:41:47.458 186962 INFO nova.compute.manager [None req-97f8d132-84e1-4af0-9077-659a224e91f9 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Get console output#033[00m
Nov 29 02:41:47 np0005539505 nova_compute[186958]: 2025-11-29 07:41:47.465 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:41:47 np0005539505 nova_compute[186958]: 2025-11-29 07:41:47.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:49 np0005539505 nova_compute[186958]: 2025-11-29 07:41:49.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:50 np0005539505 podman[247713]: 2025-11-29 07:41:50.717004611 +0000 UTC m=+0.051738024 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:41:52 np0005539505 nova_compute[186958]: 2025-11-29 07:41:52.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:54 np0005539505 nova_compute[186958]: 2025-11-29 07:41:54.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:56.349 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:41:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:56.351 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:41:56 np0005539505 nova_compute[186958]: 2025-11-29 07:41:56.350 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:57 np0005539505 nova_compute[186958]: 2025-11-29 07:41:57.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:41:58 np0005539505 podman[247730]: 2025-11-29 07:41:58.718891786 +0000 UTC m=+0.054162794 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:41:58 np0005539505 podman[247731]: 2025-11-29 07:41:58.76230488 +0000 UTC m=+0.091487794 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 02:41:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:41:59.353 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:41:59 np0005539505 nova_compute[186958]: 2025-11-29 07:41:59.936 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:02 np0005539505 nova_compute[186958]: 2025-11-29 07:42:02.480 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:02 np0005539505 podman[247781]: 2025-11-29 07:42:02.718396632 +0000 UTC m=+0.055224384 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Nov 29 02:42:02 np0005539505 podman[247782]: 2025-11-29 07:42:02.7531849 +0000 UTC m=+0.086582063 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:42:04 np0005539505 nova_compute[186958]: 2025-11-29 07:42:04.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:07 np0005539505 nova_compute[186958]: 2025-11-29 07:42:07.131 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:07 np0005539505 nova_compute[186958]: 2025-11-29 07:42:07.132 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:07 np0005539505 nova_compute[186958]: 2025-11-29 07:42:07.133 186962 DEBUG nova.objects.instance [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'flavor' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:07 np0005539505 nova_compute[186958]: 2025-11-29 07:42:07.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:08 np0005539505 nova_compute[186958]: 2025-11-29 07:42:08.034 186962 DEBUG nova.objects.instance [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:08 np0005539505 nova_compute[186958]: 2025-11-29 07:42:08.584 186962 DEBUG nova.network.neutron [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:42:08 np0005539505 nova_compute[186958]: 2025-11-29 07:42:08.894 186962 DEBUG nova.policy [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:42:09 np0005539505 nova_compute[186958]: 2025-11-29 07:42:09.940 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:11 np0005539505 nova_compute[186958]: 2025-11-29 07:42:11.596 186962 DEBUG nova.network.neutron [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Successfully created port: f8129917-8284-46b6-9714-dc84a6cb4b04 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:42:12 np0005539505 nova_compute[186958]: 2025-11-29 07:42:12.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:14 np0005539505 nova_compute[186958]: 2025-11-29 07:42:14.944 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:17 np0005539505 nova_compute[186958]: 2025-11-29 07:42:17.485 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:17 np0005539505 nova_compute[186958]: 2025-11-29 07:42:17.615 186962 DEBUG nova.network.neutron [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Successfully updated port: f8129917-8284-46b6-9714-dc84a6cb4b04 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:42:17 np0005539505 podman[247821]: 2025-11-29 07:42:17.773301647 +0000 UTC m=+0.088035105 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:42:17 np0005539505 podman[247820]: 2025-11-29 07:42:17.777165237 +0000 UTC m=+0.096649882 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.250 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.250 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.251 186962 DEBUG nova.network.neutron [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.256 186962 DEBUG nova.compute.manager [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-changed-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.256 186962 DEBUG nova.compute.manager [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing instance network info cache due to event network-changed-f8129917-8284-46b6-9714-dc84a6cb4b04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:42:18 np0005539505 nova_compute[186958]: 2025-11-29 07:42:18.256 186962 DEBUG oslo_concurrency.lockutils [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:19 np0005539505 nova_compute[186958]: 2025-11-29 07:42:19.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:21 np0005539505 podman[247866]: 2025-11-29 07:42:21.746434547 +0000 UTC m=+0.084301998 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:42:22 np0005539505 nova_compute[186958]: 2025-11-29 07:42:22.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.068 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.392 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.536 186962 DEBUG nova.network.neutron [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.562 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.562 186962 DEBUG oslo_concurrency.lockutils [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.562 186962 DEBUG nova.network.neutron [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing network info cache for port f8129917-8284-46b6-9714-dc84a6cb4b04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.565 186962 DEBUG nova.virt.libvirt.vif [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.566 186962 DEBUG nova.network.os_vif_util [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.566 186962 DEBUG nova.network.os_vif_util [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.566 186962 DEBUG os_vif [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.567 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.567 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.568 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.572 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.572 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8129917-82, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.573 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf8129917-82, col_values=(('external_ids', {'iface-id': 'f8129917-8284-46b6-9714-dc84a6cb4b04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:cc:f7', 'vm-uuid': '1054c168-50b7-42e4-aedb-6ddca8a197a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.5764] manager: (tapf8129917-82): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/370)
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.579 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.585 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.586 186962 INFO os_vif [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82')#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.586 186962 DEBUG nova.virt.libvirt.vif [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.587 186962 DEBUG nova.network.os_vif_util [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.587 186962 DEBUG nova.network.os_vif_util [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.589 186962 DEBUG nova.virt.libvirt.guest [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] attach device xml: <interface type="ethernet">
Nov 29 02:42:24 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:3b:cc:f7"/>
Nov 29 02:42:24 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:42:24 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:42:24 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:42:24 np0005539505 nova_compute[186958]:  <target dev="tapf8129917-82"/>
Nov 29 02:42:24 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:42:24 np0005539505 nova_compute[186958]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 29 02:42:24 np0005539505 kernel: tapf8129917-82: entered promiscuous mode
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.6030] manager: (tapf8129917-82): new Tun device (/org/freedesktop/NetworkManager/Devices/371)
Nov 29 02:42:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:24Z|00753|binding|INFO|Claiming lport f8129917-8284-46b6-9714-dc84a6cb4b04 for this chassis.
Nov 29 02:42:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:24Z|00754|binding|INFO|f8129917-8284-46b6-9714-dc84a6cb4b04: Claiming fa:16:3e:3b:cc:f7 10.100.0.46
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.623 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:cc:f7 10.100.0.46'], port_security=['fa:16:3e:3b:cc:f7 10.100.0.46'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.46/28', 'neutron:device_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79da6174-0485-4e06-8898-c13055f8ac79, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=f8129917-8284-46b6-9714-dc84a6cb4b04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.624 104094 INFO neutron.agent.ovn.metadata.agent [-] Port f8129917-8284-46b6-9714-dc84a6cb4b04 in datapath 6f1be974-bcaa-4b93-ab01-8adab0060f10 bound to our chassis#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.627 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6f1be974-bcaa-4b93-ab01-8adab0060f10#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.645 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99d6b881-6252-493e-8bbd-b4bb8e7f3dd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.646 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6f1be974-b1 in ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.649 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6f1be974-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.649 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[013fa856-6ff7-4f0a-a18f-7b46c0a05352]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.650 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddb58cb-8490-417b-8041-3b10ad4f8196]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 systemd-udevd[247895]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:42:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:24Z|00755|binding|INFO|Setting lport f8129917-8284-46b6-9714-dc84a6cb4b04 ovn-installed in OVS
Nov 29 02:42:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:24Z|00756|binding|INFO|Setting lport f8129917-8284-46b6-9714-dc84a6cb4b04 up in Southbound
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.666 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c236d131-6577-4ae4-a3d2-46117a3a8108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.6873] device (tapf8129917-82): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.6900] device (tapf8129917-82): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:42:24 np0005539505 nova_compute[186958]: 2025-11-29 07:42:24.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.702 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c35765-671b-4a89-b4f8-a216540f9577]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.734 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[97a556ba-78ab-4481-95cf-a9412286df1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 systemd-udevd[247898]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.7447] manager: (tap6f1be974-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/372)
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.743 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[042871bf-4ebe-4ca0-8c0e-a5e4ea119af8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.791 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0daa8177-1bf8-42b6-b75d-79cf01597585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.794 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fcf7be-7999-4d24-8ded-e5f6cf7cade3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 NetworkManager[55134]: <info>  [1764402144.8187] device (tap6f1be974-b0): carrier: link connected
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.828 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[d573869c-3cca-4555-80a5-e0be90511b2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.851 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6b4787-fffb-4e23-bd56-7d9d160f7d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6f1be974-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:49:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 759240, 'reachable_time': 31047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247922, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.875 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[970ab2e0-5170-4855-b047-84eac2c1cf28]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:49ed'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 759240, 'tstamp': 759240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247923, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.900 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a032257-cdd1-443c-8ad9-24c31794d851]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6f1be974-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:49:ed'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 235], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 759240, 'reachable_time': 31047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247924, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:24.944 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[03472135-2a79-4082-980a-9f42ea3ca0cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.027 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[66db397f-fd3d-40d5-bca2-ff29ee453c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.029 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f1be974-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.029 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.030 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f1be974-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:25 np0005539505 kernel: tap6f1be974-b0: entered promiscuous mode
Nov 29 02:42:25 np0005539505 NetworkManager[55134]: <info>  [1764402145.0347] manager: (tap6f1be974-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/373)
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.039 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6f1be974-b0, col_values=(('external_ids', {'iface-id': '5727c765-4dbb-4890-b58e-a90c8d5f55f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:42:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:25Z|00757|binding|INFO|Releasing lport 5727c765-4dbb-4890-b58e-a90c8d5f55f2 from this chassis (sb_readonly=0)
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.063 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.064 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.065 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3d57579e-e106-4017-981f-8938b1fb26ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.066 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-6f1be974-bcaa-4b93-ab01-8adab0060f10
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/6f1be974-bcaa-4b93-ab01-8adab0060f10.pid.haproxy
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 6f1be974-bcaa-4b93-ab01-8adab0060f10
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:42:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:25.067 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'env', 'PROCESS_TAG=haproxy-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6f1be974-bcaa-4b93-ab01-8adab0060f10.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:42:25 np0005539505 podman[247956]: 2025-11-29 07:42:25.434865105 +0000 UTC m=+0.037374922 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.585 186962 DEBUG nova.virt.libvirt.driver [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.585 186962 DEBUG nova.virt.libvirt.driver [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.586 186962 DEBUG nova.virt.libvirt.driver [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:0d:fa:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.586 186962 DEBUG nova.virt.libvirt.driver [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:3b:cc:f7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.610 186962 DEBUG nova.virt.libvirt.guest [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:42:25</nova:creationTime>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:42:25 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    <nova:port uuid="f8129917-8284-46b6-9714-dc84a6cb4b04">
Nov 29 02:42:25 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.46" ipVersion="4"/>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:42:25 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:42:25 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:42:25 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.624 186962 DEBUG nova.compute.manager [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.624 186962 DEBUG oslo_concurrency.lockutils [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.625 186962 DEBUG oslo_concurrency.lockutils [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.625 186962 DEBUG oslo_concurrency.lockutils [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.625 186962 DEBUG nova.compute.manager [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.626 186962 WARNING nova.compute.manager [req-e728533f-3cf4-46cb-b407-eee19f4d9ad3 req-f9887cb3-f9cc-461f-b554-6f79d11a120b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:42:25 np0005539505 nova_compute[186958]: 2025-11-29 07:42:25.645 186962 DEBUG oslo_concurrency.lockutils [None req-356b7608-bba5-40af-a4a3-963f89acfe17 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 18.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:26 np0005539505 podman[247956]: 2025-11-29 07:42:26.369121242 +0000 UTC m=+0.971631029 container create 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:42:26 np0005539505 systemd[1]: Started libpod-conmon-6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873.scope.
Nov 29 02:42:26 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:42:26 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec088acc589728e063b7da2c3bb90c76d29b7e4aa4df3d6b39e9ab55f2fa2acf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:42:26 np0005539505 podman[247956]: 2025-11-29 07:42:26.565349348 +0000 UTC m=+1.167859155 container init 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 29 02:42:26 np0005539505 podman[247956]: 2025-11-29 07:42:26.573336807 +0000 UTC m=+1.175846584 container start 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:42:26 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [NOTICE]   (247976) : New worker (247978) forked
Nov 29 02:42:26 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [NOTICE]   (247976) : Loading success.
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.489 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:27.524 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:27.525 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:42:27.526 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.728 186962 DEBUG nova.compute.manager [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.729 186962 DEBUG oslo_concurrency.lockutils [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.730 186962 DEBUG oslo_concurrency.lockutils [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.730 186962 DEBUG oslo_concurrency.lockutils [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.731 186962 DEBUG nova.compute.manager [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.731 186962 WARNING nova.compute.manager [req-9cd5ecc0-8c94-4cd2-bb56-63ad4896f10f req-3b6af2b6-fbed-4aa9-92dd-2081f8b65422 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:42:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:27Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:cc:f7 10.100.0.46
Nov 29 02:42:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:42:27Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:cc:f7 10.100.0.46
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.799 186962 DEBUG nova.network.neutron [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated VIF entry in instance network info cache for port f8129917-8284-46b6-9714-dc84a6cb4b04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:42:27 np0005539505 nova_compute[186958]: 2025-11-29 07:42:27.800 186962 DEBUG nova.network.neutron [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:28 np0005539505 nova_compute[186958]: 2025-11-29 07:42:28.086 186962 DEBUG oslo_concurrency.lockutils [req-6dbde7a7-5ef8-4e44-aa3a-e704c6ae2644 req-fba398b4-35dd-4a0b-b0b7-9ccc96ee2f6f 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:29 np0005539505 nova_compute[186958]: 2025-11-29 07:42:29.577 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:29 np0005539505 podman[247996]: 2025-11-29 07:42:29.771158378 +0000 UTC m=+0.079544991 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:42:29 np0005539505 podman[247997]: 2025-11-29 07:42:29.843507772 +0000 UTC m=+0.146527832 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:42:31 np0005539505 nova_compute[186958]: 2025-11-29 07:42:31.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:32 np0005539505 nova_compute[186958]: 2025-11-29 07:42:32.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:32 np0005539505 nova_compute[186958]: 2025-11-29 07:42:32.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:33 np0005539505 nova_compute[186958]: 2025-11-29 07:42:33.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:33 np0005539505 nova_compute[186958]: 2025-11-29 07:42:33.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:42:33 np0005539505 podman[248044]: 2025-11-29 07:42:33.742349342 +0000 UTC m=+0.078085899 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 29 02:42:33 np0005539505 podman[248045]: 2025-11-29 07:42:33.764088046 +0000 UTC m=+0.084302788 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:42:34 np0005539505 nova_compute[186958]: 2025-11-29 07:42:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:34 np0005539505 nova_compute[186958]: 2025-11-29 07:42:34.581 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.558 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.558 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.558 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:42:36 np0005539505 nova_compute[186958]: 2025-11-29 07:42:36.559 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:42:37 np0005539505 nova_compute[186958]: 2025-11-29 07:42:37.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:39 np0005539505 nova_compute[186958]: 2025-11-29 07:42:39.585 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:41 np0005539505 nova_compute[186958]: 2025-11-29 07:42:41.620 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.402 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.402 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.427 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.427 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.579 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.636 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.638 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.695 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.874 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.875 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5560MB free_disk=73.0444221496582GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.875 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.875 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.981 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.981 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:42:42 np0005539505 nova_compute[186958]: 2025-11-29 07:42:42.981 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:42:43 np0005539505 nova_compute[186958]: 2025-11-29 07:42:43.019 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:42:43 np0005539505 nova_compute[186958]: 2025-11-29 07:42:43.062 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:42:43 np0005539505 nova_compute[186958]: 2025-11-29 07:42:43.065 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:42:43 np0005539505 nova_compute[186958]: 2025-11-29 07:42:43.066 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:42:44 np0005539505 nova_compute[186958]: 2025-11-29 07:42:44.589 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:46 np0005539505 nova_compute[186958]: 2025-11-29 07:42:46.042 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:46 np0005539505 nova_compute[186958]: 2025-11-29 07:42:46.043 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:47 np0005539505 nova_compute[186958]: 2025-11-29 07:42:47.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.109 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'name': 'tempest-TestNetworkBasicOps-server-2110898837', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000009f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ec8b80be17a14d1caf666636283749d0', 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'hostId': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.110 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.114 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1054c168-50b7-42e4-aedb-6ddca8a197a4 / tap7a99c6a7-ba inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.115 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1054c168-50b7-42e4-aedb-6ddca8a197a4 / tapf8129917-82 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.116 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.117 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d3bbc15-1a89-42ad-a7d9-555054c58df5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.111115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '00385886-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': 'cb5e81dfc5aba64f588780fa66bbedd1fd4cd761b5e0c218c58ee076622425a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.111115', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '003875dc-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '187faca721e6e7551e5e34e20566e2fa36af9d0b655323dd285ff990abeb0f8f'}]}, 'timestamp': '2025-11-29 07:42:48.117653', '_unique_id': 'f75a42b7ce7d4875bb3349f0337f3b75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.122 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.122 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47d9ee3e-b877-4160-b301-3636463e8d77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.122269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '0039428c-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '9bff4ef3356a0411cbd1f0c946d22ed6bd12f91c2bc9362ec84b0b50d850dad0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.122269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '0039590c-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '344a84a143762cea0cdaec764e84bcfbbbc57fea040f14b347e2ebe8acec1e7a'}]}, 'timestamp': '2025-11-29 07:42:48.123498', '_unique_id': 'babd78d6c59b4bacb5bdd50317e949f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.124 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.127 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.127 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6df865d5-5475-4264-bd10-2114a52aab83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.127124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '0039fb14-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '85c72a8c165327867353a777003be303d57c2d34eaf4a65719b4fe73d0b4150d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.127124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '003a09d8-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '5f24816c727f3c26bfd942bc9e5065fcfb7ccf4c794219ebd185604c1ada3060'}]}, 'timestamp': '2025-11-29 07:42:48.127857', '_unique_id': 'fc14e5d6b86c4e1687d4c2eace4d3a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.128 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.129 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.bytes volume: 27803 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18fa9d49-3bc3-4535-bfce-9e5b825e2641', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27803, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.129937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '003a6612-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': 'a0289bb87b3d0a65098745f728d4e5d6b0c8334888f5d369c7c07fe3332a762a'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1330, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.129937', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '003a70c6-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '8da4f654e443e8fd6d9e2d603048721c6ec1c9bb648bf14b5d416a766e22c808'}]}, 'timestamp': '2025-11-29 07:42:48.130482', '_unique_id': '7c4cd2c070454180a5f51bf2e722511f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.166 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.latency volume: 174624732 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.167 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.latency volume: 19037743 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd989dd53-9d3a-484f-a52f-063ffde4440d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 174624732, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.131925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004012a6-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': 'fbae842c4b2eb8a52cf80f0cf5ec153d640a933b1450f51483e32c673235fc03'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19037743, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.131925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00402b4c-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '0e60f4a6173b4164fa9fed71c54b9cc00e25312d7204ca0fdc5e07ca5fe9accd'}]}, 'timestamp': '2025-11-29 07:42:48.168155', '_unique_id': 'e9ab326cfa9842cfa225c882810f67ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.169 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.171 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.requests volume: 1081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.172 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5370100f-da4a-412d-9039-369b5be24e4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1081, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.171933', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0040d664-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '8cfb3ae59382bbbbf44f04232fdc1ac0107f60bee166da9591403c0511fa6e20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.171933', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0040eb0e-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '35030f605a9cc48b120a6fb17898a3ff6717f9059f18d93bd3ef0109cd0e873e'}]}, 'timestamp': '2025-11-29 07:42:48.173045', '_unique_id': 'fe2102897b0d445a8b036a3989127722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.176 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.requests volume: 353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.176 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e941cf62-3f06-47f5-b57b-e9457980e859', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 353, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.176161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00417e3e-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '20726923f418396e5288d7ce63ba8b0879bb4ebce772c6f5ed7ba0a97582118a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': 
None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.176161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00419342-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': 'fbfc3be697ee8dd4350ab45c275af49e35f3663585d49240263aad7b7db4fea2'}]}, 'timestamp': '2025-11-29 07:42:48.177399', '_unique_id': 'd71ffc4527f74064924d656d7a243517'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.181 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.bytes volume: 30063104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.181 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e3d9875-842f-4dd7-b5f1-77d00074c0c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30063104, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.181138', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00423e50-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': 'c11af0f529b628d60b4b9b88d40ff7a084611241937348564e7ca29bbd76cb6b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.181138', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004252aa-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '766916d804037e53f322ac79cfaa219b572aa3ccade3712b02f228ec942d0332'}]}, 'timestamp': '2025-11-29 07:42:48.182292', '_unique_id': '42c5ec53aac94023abbbc86ed8360af0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.184 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.185 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b87d88d-8734-4f8e-969a-aee2357fdbe8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.184846', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '0042cae6-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '0f6bce4996cb4aacaaa0b9feec0368b0c57e74c12bc740324f6e5f35f18dadf3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.184846', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '0042def0-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '45ff04e2730094b08a65be7427888a96208c92309fbc35b55dd6eba0f1b5a0a9'}]}, 'timestamp': '2025-11-29 07:42:48.185884', '_unique_id': 'd23002df2d2f4b11891ffb7266234d5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.189 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets volume: 146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.189 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70a33398-6b8f-4756-9241-66fe8e259369', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 146, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.189176', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '00437982-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': 'f957b17fea8d58f29d2de67c206604dc12280c6953e9b4bfa9aa7a2ee1fe06a7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.189176', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '004390fc-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': 'acb40e8e403fc84ac1f5626a17601e8bb65f3e2efea636f908ca2c040b5e4880'}]}, 'timestamp': '2025-11-29 07:42:48.190477', '_unique_id': 'a1241ef42ca34034a3e191bf393e6f6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.193 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.latency volume: 40320541394 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.194 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ffc4dcd-6b81-4bc1-b13d-d9dc91357034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40320541394, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.193788', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00443160-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': 'cd052d75d78f1cc4b583be7b77c6da70f66b7e4e8b9e719383ad938cf00e546d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.193788', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0044481c-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '53d11890cef889caf073a59faff8dbd5c7a9bad48dd99e1a4eef2b836e602cf3'}]}, 'timestamp': '2025-11-29 07:42:48.195089', '_unique_id': '531c07a295244ec99327fa02a0d727d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.198 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.198 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>]
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.214 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/memory.usage volume: 43.46484375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8d25fe3-d328-4706-a6bf-51979bac5a25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.46484375, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'timestamp': '2025-11-29T07:42:48.199678', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '004756b0-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.855188732, 'message_signature': 'b882593d5ebd68da5c5031cfa6d077124825c5198e5b71fe023a07db09cbd70d'}]}, 'timestamp': '2025-11-29 07:42:48.215164', '_unique_id': '7a6752e438eb477e84b7a57f96aff9e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.219 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>]
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.220 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>]
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.220 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.221 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de6af3cf-c9cb-4ef3-8d1c-6f9782f60784', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.220489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '00483e9a-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '72c36820fb4f7c9f919ca8b639aee9c3f53c7fded89dbde783ef4498a7c3cab8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.220489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '00484f98-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '6517c4b4476414519f4bdc16180489e50128682e9d4caabf94aa800fc67bd59e'}]}, 'timestamp': '2025-11-29 07:42:48.221466', '_unique_id': '8dd955ea8dc34391ae32701a6e6b0d7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.222 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.224 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.bytes volume: 23742 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.224 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.bytes volume: 1494 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23bf308a-6f6c-4486-9403-80bc2eb787f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23742, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.224002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '0048c112-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '6fa77a458bcaf2c30ccfb46f693f8cc4ee9d8f61504f41776c705c718a89c619'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1494, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.224002', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '0048cf40-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '7b6e44e49109344e6f9f2e57ee08a731005c65edd9588b4aed4b00e39fe10370'}]}, 'timestamp': '2025-11-29 07:42:48.224725', '_unique_id': 'fdd2b93a0aa64dfa819dd2778f77d201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.225 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.226 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.227 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7a24107-72e3-4ce7-904b-fdf65bb23920', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.226737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '00492bca-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': 'cb07c43669a4c6c558a8236e972a223e1acf33c22b27dfbca7b13a3efad2693f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.226737', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '004939a8-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '5d42f00906e6b03dc1b2c14de346749a26064d6a2486954aa9bb1e9898230d7a'}]}, 'timestamp': '2025-11-29 07:42:48.227443', '_unique_id': 'e43c18d2c2b84872945e0207d571b903'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.228 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.229 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.229 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2110898837>]
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.240 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.241 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec2cdb22-7235-4f87-a08f-2e4015682c22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.229653', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004b5bd4-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': 'bbfaca04bb1bd9e3f9903ad675c591f26d15346726f32e05939026bc5958cb9c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.229653', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004b6a3e-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': '2e63722d92d2d8e86a3f3b4b5e4cb4bbb4c3c85bc9354d3665f697a7030b8ced'}]}, 'timestamp': '2025-11-29 07:42:48.241793', '_unique_id': 'b40247ba694a485183763cd0dd7cbe0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.242 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.243 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.243 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c2f6b58-dd61-4a74-90d0-fcebd8c1a35f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.243541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004bbbec-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': '5eee1ab1021ba908ddb847817110d2dc0d14910ad4d263cf750e5fb1f00c57cb'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.243541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004bc7a4-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': '99fdf84d49ede62652c7c93f22d5735b32cb837408d32b68d78d4b3fa5c0a830'}]}, 'timestamp': '2025-11-29 07:42:48.244152', '_unique_id': '2cacb4f037f04ad18f2c8b7434d71d7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.244 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.245 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets volume: 148 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.246 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0fac2b3e-ddee-49e3-8695-97eadc62091e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 148, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tap7a99c6a7-ba', 'timestamp': '2025-11-29T07:42:48.245884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tap7a99c6a7-ba', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:fa:32', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7a99c6a7-ba'}, 'message_id': '004c17f4-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '605372a16fe5b12a578b57b21c285063a4efcbc0a36f1b67442b553fc9fbf53a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 'instance-0000009f-1054c168-50b7-42e4-aedb-6ddca8a197a4-tapf8129917-82', 'timestamp': '2025-11-29T07:42:48.245884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'tapf8129917-82', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3b:cc:f7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf8129917-82'}, 'message_id': '004c256e-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.751987803, 'message_signature': '484f793e368332a2e0efef2d505ac0a801f7e0d6c93c70548fca67ddbc7f7987'}]}, 'timestamp': '2025-11-29 07:42:48.246564', '_unique_id': 'd59301d5408946a587c75e272221f785'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.247 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.248 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/cpu volume: 12770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e77b79c-4725-476a-84da-019ee4f38984', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12770000000, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'timestamp': '2025-11-29T07:42:48.248248', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '004c7442-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.855188732, 'message_signature': '89b4151ba80910b18e93047d2c67d072b861175787f9980d59c136fb9746651e'}]}, 'timestamp': '2025-11-29 07:42:48.248584', '_unique_id': '646de4ece67747d2a1d31fb9727f7073'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.249 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.250 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.250 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd11a693-4625-47aa-91a4-efd30da45ee1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.250154', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004cc000-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': 'b261fbbb83ef367c83e7867b81b8e1aaa4fda2e0aeefde3d45ed46c4777f869f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': 
'1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.250154', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004cce74-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.870531132, 'message_signature': 'e2aa12208c3cb696aba2596b8c94110bc6d45590a7af1edcb59188f68ba62283'}]}, 'timestamp': '2025-11-29 07:42:48.250917', '_unique_id': '004f38d064e64680a373e4061a437250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.251 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.252 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.bytes volume: 73166848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.253 12 DEBUG ceilometer.compute.pollsters [-] 1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9db098e0-b4b8-4ccc-b8da-cd278c72073a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73166848, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-vda', 'timestamp': '2025-11-29T07:42:48.252888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '004d2b62-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': 'dde352a36d75018dfe561446bd3c627ad19b90f650b0e7c0b76455b95171fcf4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_name': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_name': None, 
'resource_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4-sda', 'timestamp': '2025-11-29T07:42:48.252888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2110898837', 'name': 'instance-0000009f', 'instance_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'instance_type': 'm1.nano', 'host': 'c34a7a35dd3c433c3f109f39c3184cfe6f95e71dba189cfdbb38a073', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '004d38fa-ccf7-11f0-8954-fa163e5a5606', 'monotonic_time': 7615.772777629, 'message_signature': '89858044c51a60bcc7cbfcb8547aed36c675e0dd77fe5817d7fdfdc50dc37287'}]}, 'timestamp': '2025-11-29 07:42:48.253605', '_unique_id': '455f97fb597044edb57c52ab6ea96007'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:42:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:42:48.254 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:42:48 np0005539505 nova_compute[186958]: 2025-11-29 07:42:48.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:42:48 np0005539505 podman[248092]: 2025-11-29 07:42:48.731695137 +0000 UTC m=+0.059383323 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:42:48 np0005539505 podman[248091]: 2025-11-29 07:42:48.741234341 +0000 UTC m=+0.067637600 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Nov 29 02:42:49 np0005539505 nova_compute[186958]: 2025-11-29 07:42:49.593 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:52 np0005539505 nova_compute[186958]: 2025-11-29 07:42:52.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:52 np0005539505 podman[248133]: 2025-11-29 07:42:52.715589196 +0000 UTC m=+0.049280604 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Nov 29 02:42:54 np0005539505 nova_compute[186958]: 2025-11-29 07:42:54.597 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:57 np0005539505 nova_compute[186958]: 2025-11-29 07:42:57.500 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:42:59 np0005539505 nova_compute[186958]: 2025-11-29 07:42:59.601 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:00 np0005539505 podman[248152]: 2025-11-29 07:43:00.763965506 +0000 UTC m=+0.078841742 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:43:00 np0005539505 podman[248153]: 2025-11-29 07:43:00.807169294 +0000 UTC m=+0.122045690 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:43:02 np0005539505 nova_compute[186958]: 2025-11-29 07:43:02.538 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:04 np0005539505 nova_compute[186958]: 2025-11-29 07:43:04.604 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:04 np0005539505 podman[248202]: 2025-11-29 07:43:04.740617165 +0000 UTC m=+0.065048016 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 02:43:04 np0005539505 podman[248203]: 2025-11-29 07:43:04.785597154 +0000 UTC m=+0.099808612 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:43:05 np0005539505 nova_compute[186958]: 2025-11-29 07:43:05.921 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:05 np0005539505 nova_compute[186958]: 2025-11-29 07:43:05.950 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Triggering sync for uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 29 02:43:05 np0005539505 nova_compute[186958]: 2025-11-29 07:43:05.951 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:05 np0005539505 nova_compute[186958]: 2025-11-29 07:43:05.952 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.039 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.399 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.400 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:06 np0005539505 nova_compute[186958]: 2025-11-29 07:43:06.401 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:43:07 np0005539505 nova_compute[186958]: 2025-11-29 07:43:07.541 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:07.796 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:07 np0005539505 nova_compute[186958]: 2025-11-29 07:43:07.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:07 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:07.797 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:43:09 np0005539505 nova_compute[186958]: 2025-11-29 07:43:09.608 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:43:10Z|00758|binding|INFO|Releasing lport 3e62b79d-3423-4f59-a770-7cbfebfe062a from this chassis (sb_readonly=0)
Nov 29 02:43:10 np0005539505 ovn_controller[95143]: 2025-11-29T07:43:10Z|00759|binding|INFO|Releasing lport 5727c765-4dbb-4890-b58e-a90c8d5f55f2 from this chassis (sb_readonly=0)
Nov 29 02:43:11 np0005539505 nova_compute[186958]: 2025-11-29 07:43:11.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:12 np0005539505 nova_compute[186958]: 2025-11-29 07:43:12.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:13 np0005539505 nova_compute[186958]: 2025-11-29 07:43:13.090 186962 DEBUG nova.compute.manager [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-changed-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:43:13 np0005539505 nova_compute[186958]: 2025-11-29 07:43:13.091 186962 DEBUG nova.compute.manager [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing instance network info cache due to event network-changed-f8129917-8284-46b6-9714-dc84a6cb4b04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:43:13 np0005539505 nova_compute[186958]: 2025-11-29 07:43:13.091 186962 DEBUG oslo_concurrency.lockutils [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:43:13 np0005539505 nova_compute[186958]: 2025-11-29 07:43:13.092 186962 DEBUG oslo_concurrency.lockutils [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:43:13 np0005539505 nova_compute[186958]: 2025-11-29 07:43:13.092 186962 DEBUG nova.network.neutron [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing network info cache for port f8129917-8284-46b6-9714-dc84a6cb4b04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:43:14 np0005539505 nova_compute[186958]: 2025-11-29 07:43:14.274 186962 DEBUG nova.network.neutron [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated VIF entry in instance network info cache for port f8129917-8284-46b6-9714-dc84a6cb4b04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:43:14 np0005539505 nova_compute[186958]: 2025-11-29 07:43:14.274 186962 DEBUG nova.network.neutron [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:43:14 np0005539505 nova_compute[186958]: 2025-11-29 07:43:14.292 186962 DEBUG oslo_concurrency.lockutils [req-c4625f37-81ad-40e3-8a93-c4d5ac99e10a req-a753f67a-4d9b-455f-94f8-2cb22278f632 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:43:14 np0005539505 nova_compute[186958]: 2025-11-29 07:43:14.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:15 np0005539505 podman[201154]: time="2025-11-29T07:43:15Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 29 02:43:15 np0005539505 podman[201154]: @ - - [29/Nov/2025:07:43:15 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25376 "" "Go-http-client/1.1"
Nov 29 02:43:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:15.799 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:43:17 np0005539505 nova_compute[186958]: 2025-11-29 07:43:17.596 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:19 np0005539505 nova_compute[186958]: 2025-11-29 07:43:19.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:19 np0005539505 podman[248244]: 2025-11-29 07:43:19.71912012 +0000 UTC m=+0.048969845 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:43:19 np0005539505 podman[248243]: 2025-11-29 07:43:19.720065697 +0000 UTC m=+0.053259318 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:43:21 np0005539505 nova_compute[186958]: 2025-11-29 07:43:21.130 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:22 np0005539505 nova_compute[186958]: 2025-11-29 07:43:22.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:23 np0005539505 nova_compute[186958]: 2025-11-29 07:43:23.281 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:23 np0005539505 podman[248284]: 2025-11-29 07:43:23.725441303 +0000 UTC m=+0.052297840 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:43:24 np0005539505 nova_compute[186958]: 2025-11-29 07:43:24.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:24.699 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:24.702 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated#033[00m
Nov 29 02:43:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:24.706 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:43:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:24.708 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7a1afa90-0fba-4387-b677-9ff06dc00a8f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:43:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:27.525 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:27.525 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:27.526 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:27 np0005539505 nova_compute[186958]: 2025-11-29 07:43:27.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:28.956 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8:0:1:f816:3eff:fec8:31c8 2001:db8::f816:3eff:fec8:31c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec8:31c8/64 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ff62ba9a-db01-45ed-b4a4-c5b2c8f5434e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4e85a268-4b8a-4015-a903-2252d696f8f5) old=Port_Binding(mac=['fa:16:3e:c8:31:c8 10.100.0.2 2001:db8::f816:3eff:fec8:31c8'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec8:31c8/64', 'neutron:device_id': 'ovnmeta-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e23e9510-a780-4254-b7f0-36040139e7db', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:43:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:28.957 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4e85a268-4b8a-4015-a903-2252d696f8f5 in datapath e23e9510-a780-4254-b7f0-36040139e7db updated#033[00m
Nov 29 02:43:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:28.959 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e23e9510-a780-4254-b7f0-36040139e7db, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:43:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:43:28.960 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5a620098-cc88-4c72-86a2-9efecedffa86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:43:29 np0005539505 nova_compute[186958]: 2025-11-29 07:43:29.622 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:31 np0005539505 podman[248304]: 2025-11-29 07:43:31.73260254 +0000 UTC m=+0.057050727 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:43:31 np0005539505 podman[248305]: 2025-11-29 07:43:31.783017775 +0000 UTC m=+0.109882432 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller)
Nov 29 02:43:32 np0005539505 nova_compute[186958]: 2025-11-29 07:43:32.413 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:32 np0005539505 nova_compute[186958]: 2025-11-29 07:43:32.670 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:33 np0005539505 nova_compute[186958]: 2025-11-29 07:43:33.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:33 np0005539505 nova_compute[186958]: 2025-11-29 07:43:33.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:43:34 np0005539505 nova_compute[186958]: 2025-11-29 07:43:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:34 np0005539505 nova_compute[186958]: 2025-11-29 07:43:34.626 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:35 np0005539505 podman[248350]: 2025-11-29 07:43:35.713140342 +0000 UTC m=+0.050156169 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:43:35 np0005539505 podman[248351]: 2025-11-29 07:43:35.719995788 +0000 UTC m=+0.051977771 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:43:36 np0005539505 nova_compute[186958]: 2025-11-29 07:43:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.621 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.622 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.622 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.622 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:43:37 np0005539505 nova_compute[186958]: 2025-11-29 07:43:37.672 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:39 np0005539505 nova_compute[186958]: 2025-11-29 07:43:39.630 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.240 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.256 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.257 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.258 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.398 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.399 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.399 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.399 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.455 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.507 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.508 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.562 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.729 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.730 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5560MB free_disk=73.04350662231445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.731 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.731 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.812 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.813 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.813 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.860 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.878 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.879 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:43:42 np0005539505 nova_compute[186958]: 2025-11-29 07:43:42.880 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:43:44 np0005539505 nova_compute[186958]: 2025-11-29 07:43:44.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:46 np0005539505 nova_compute[186958]: 2025-11-29 07:43:46.875 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:46 np0005539505 nova_compute[186958]: 2025-11-29 07:43:46.876 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:43:47 np0005539505 nova_compute[186958]: 2025-11-29 07:43:47.680 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:49 np0005539505 nova_compute[186958]: 2025-11-29 07:43:49.687 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:50 np0005539505 podman[248395]: 2025-11-29 07:43:50.750313648 +0000 UTC m=+0.074002822 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 29 02:43:50 np0005539505 podman[248396]: 2025-11-29 07:43:50.750672279 +0000 UTC m=+0.070341368 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:43:52 np0005539505 nova_compute[186958]: 2025-11-29 07:43:52.682 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:54 np0005539505 nova_compute[186958]: 2025-11-29 07:43:54.691 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:54 np0005539505 podman[248441]: 2025-11-29 07:43:54.754513131 +0000 UTC m=+0.080636013 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:43:57 np0005539505 nova_compute[186958]: 2025-11-29 07:43:57.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:43:59 np0005539505 nova_compute[186958]: 2025-11-29 07:43:59.694 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:02 np0005539505 nova_compute[186958]: 2025-11-29 07:44:02.686 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:02 np0005539505 podman[248457]: 2025-11-29 07:44:02.727020385 +0000 UTC m=+0.060402773 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:44:02 np0005539505 podman[248458]: 2025-11-29 07:44:02.783455603 +0000 UTC m=+0.107799272 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible)
Nov 29 02:44:04 np0005539505 nova_compute[186958]: 2025-11-29 07:44:04.697 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:06 np0005539505 podman[248507]: 2025-11-29 07:44:06.711981923 +0000 UTC m=+0.049733467 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 29 02:44:06 np0005539505 podman[248508]: 2025-11-29 07:44:06.718993924 +0000 UTC m=+0.051092946 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:44:07 np0005539505 nova_compute[186958]: 2025-11-29 07:44:07.690 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:09 np0005539505 nova_compute[186958]: 2025-11-29 07:44:09.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.134 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-f8129917-8284-46b6-9714-dc84a6cb4b04" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.134 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-f8129917-8284-46b6-9714-dc84a6cb4b04" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.148 186962 DEBUG nova.objects.instance [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'flavor' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.176 186962 DEBUG nova.virt.libvirt.vif [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.177 186962 DEBUG nova.network.os_vif_util [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.178 186962 DEBUG nova.network.os_vif_util [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.180 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.183 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.186 186962 DEBUG nova.virt.libvirt.driver [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Attempting to detach device tapf8129917-82 from instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.186 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:3b:cc:f7"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <target dev="tapf8129917-82"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.532 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.536 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface>not found in domain: <domain type='kvm' id='79'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <name>instance-0000009f</name>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <uuid>1054c168-50b7-42e4-aedb-6ddca8a197a4</uuid>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:42:25</nova:creationTime>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:port uuid="f8129917-8284-46b6-9714-dc84a6cb4b04">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.46" ipVersion="4"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='serial'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='uuid'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk' index='2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config' index='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0d:fa:32'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='tap7a99c6a7-ba'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:3b:cc:f7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='tapf8129917-82'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='net1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c56,c432</label>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c56,c432</imagelabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.537 186962 INFO nova.virt.libvirt.driver [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully detached device tapf8129917-82 from instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 from the persistent domain config.
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.538 186962 DEBUG nova.virt.libvirt.driver [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] (1/8): Attempting to detach device tapf8129917-82 with device alias net1 from instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.538 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] detach device xml: <interface type="ethernet">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <mac address="fa:16:3e:3b:cc:f7"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <model type="virtio"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <mtu size="1442"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <target dev="tapf8129917-82"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </interface>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 29 02:44:11 np0005539505 kernel: tapf8129917-82 (unregistering): left promiscuous mode
Nov 29 02:44:11 np0005539505 NetworkManager[55134]: <info>  [1764402251.5922] device (tapf8129917-82): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:44:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:11Z|00760|binding|INFO|Releasing lport f8129917-8284-46b6-9714-dc84a6cb4b04 from this chassis (sb_readonly=0)
Nov 29 02:44:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:11Z|00761|binding|INFO|Setting lport f8129917-8284-46b6-9714-dc84a6cb4b04 down in Southbound
Nov 29 02:44:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:11Z|00762|binding|INFO|Removing iface tapf8129917-82 ovn-installed in OVS
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.602 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.609 186962 DEBUG nova.virt.libvirt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Received event <DeviceRemovedEvent: 1764402251.6089082, 1054c168-50b7-42e4-aedb-6ddca8a197a4 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.610 186962 DEBUG nova.virt.libvirt.driver [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Start waiting for the detach event from libvirt for device tapf8129917-82 with device alias net1 for instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.611 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:11.614 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:cc:f7 10.100.0.46', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.46/28', 'neutron:device_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79da6174-0485-4e06-8898-c13055f8ac79, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=f8129917-8284-46b6-9714-dc84a6cb4b04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.615 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface>not found in domain: <domain type='kvm' id='79'>
Nov 29 02:44:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:11.615 104094 INFO neutron.agent.ovn.metadata.agent [-] Port f8129917-8284-46b6-9714-dc84a6cb4b04 in datapath 6f1be974-bcaa-4b93-ab01-8adab0060f10 unbound from our chassis#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <name>instance-0000009f</name>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <uuid>1054c168-50b7-42e4-aedb-6ddca8a197a4</uuid>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:42:25</nova:creationTime>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:port uuid="f8129917-8284-46b6-9714-dc84a6cb4b04">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.46" ipVersion="4"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='serial'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='uuid'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk' index='2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config' index='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:44:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:11.617 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6f1be974-bcaa-4b93-ab01-8adab0060f10, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0d:fa:32'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target dev='tap7a99c6a7-ba'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c56,c432</label>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c56,c432</imagelabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.615 186962 INFO nova.virt.libvirt.driver [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully detached device tapf8129917-82 from instance 1054c168-50b7-42e4-aedb-6ddca8a197a4 from the live domain config.#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.615 186962 DEBUG nova.virt.libvirt.vif [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.616 186962 DEBUG nova.network.os_vif_util [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.616 186962 DEBUG nova.network.os_vif_util [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.617 186962 DEBUG os_vif [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:11.618 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b8257ca5-acac-48ea-a3af-fec9358c4f52]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:44:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:11.618 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 namespace which is not needed anymore
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.619 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8129917-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.620 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.622 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.626 186962 INFO os_vif [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82')#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.627 186962 DEBUG nova.virt.libvirt.guest [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:44:11</nova:creationTime>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:11 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:11 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:11 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.804 186962 DEBUG nova.compute.manager [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-unplugged-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.805 186962 DEBUG oslo_concurrency.lockutils [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.805 186962 DEBUG oslo_concurrency.lockutils [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.805 186962 DEBUG oslo_concurrency.lockutils [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.806 186962 DEBUG nova.compute.manager [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-unplugged-f8129917-8284-46b6-9714-dc84a6cb4b04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:11 np0005539505 nova_compute[186958]: 2025-11-29 07:44:11.806 186962 WARNING nova.compute.manager [req-885387cc-f712-46d4-aa29-289dcdb8b504 req-21196cbb-c64a-41ca-b5f3-0c5deb8b116c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-unplugged-f8129917-8284-46b6-9714-dc84a6cb4b04 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.174 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.174 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.174 186962 DEBUG nova.network.neutron [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:44:12 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [NOTICE]   (247976) : haproxy version is 2.8.14-c23fe91
Nov 29 02:44:12 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [NOTICE]   (247976) : path to executable is /usr/sbin/haproxy
Nov 29 02:44:12 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [WARNING]  (247976) : Exiting Master process...
Nov 29 02:44:12 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [ALERT]    (247976) : Current worker (247978) exited with code 143 (Terminated)
Nov 29 02:44:12 np0005539505 neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10[247972]: [WARNING]  (247976) : All workers exited. Exiting... (0)
Nov 29 02:44:12 np0005539505 systemd[1]: libpod-6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873.scope: Deactivated successfully.
Nov 29 02:44:12 np0005539505 podman[248570]: 2025-11-29 07:44:12.228702669 +0000 UTC m=+0.522990396 container died 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.295 186962 DEBUG nova.compute.manager [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-deleted-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.295 186962 INFO nova.compute.manager [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Neutron deleted interface f8129917-8284-46b6-9714-dc84a6cb4b04; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.295 186962 DEBUG nova.network.neutron [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.316 186962 DEBUG nova.objects.instance [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'system_metadata' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.338 186962 DEBUG nova.objects.instance [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lazy-loading 'flavor' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.364 186962 DEBUG nova.virt.libvirt.vif [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.365 186962 DEBUG nova.network.os_vif_util [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.365 186962 DEBUG nova.network.os_vif_util [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.367 186962 DEBUG nova.virt.libvirt.guest [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.370 186962 DEBUG nova.virt.libvirt.guest [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> not found in domain: <domain type='kvm' id='79'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <name>instance-0000009f</name>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <uuid>1054c168-50b7-42e4-aedb-6ddca8a197a4</uuid>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:44:11</nova:creationTime>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='serial'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='uuid'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk' index='2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config' index='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0d:fa:32'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='tap7a99c6a7-ba'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c56,c432</label>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c56,c432</imagelabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.371 186962 DEBUG nova.virt.libvirt.guest [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.374 186962 DEBUG nova.virt.libvirt.guest [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3b:cc:f7"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapf8129917-82"/></interface>not found in domain: <domain type='kvm' id='79'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <name>instance-0000009f</name>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <uuid>1054c168-50b7-42e4-aedb-6ddca8a197a4</uuid>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:44:11</nova:creationTime>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <memory unit='KiB'>131072</memory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <vcpu placement='static'>1</vcpu>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <resource>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <partition>/machine</partition>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </resource>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <sysinfo type='smbios'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='manufacturer'>RDO</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='product'>OpenStack Compute</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='serial'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='uuid'>1054c168-50b7-42e4-aedb-6ddca8a197a4</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <entry name='family'>Virtual Machine</entry>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <boot dev='hd'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <smbios mode='sysinfo'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <vmcoreinfo state='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <cpu mode='custom' match='exact' check='full'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <model fallback='forbid'>Nehalem</model>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='x2apic'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='hypervisor'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <feature policy='require' name='vme'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <clock offset='utc'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='pit' tickpolicy='delay'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <timer name='hpet' present='no'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_poweroff>destroy</on_poweroff>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_reboot>restart</on_reboot>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <on_crash>destroy</on_crash>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <disk type='file' device='disk'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk' index='2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backingStore type='file' index='3'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <format type='raw'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <source file='/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <backingStore/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      </backingStore>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='vda' bus='virtio'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='virtio-disk0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <disk type='file' device='cdrom'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='qemu' type='raw' cache='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/disk.config' index='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backingStore/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='sda' bus='sata'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <readonly/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='sata0-0-0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='0' model='pcie-root'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pcie.0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='1' port='0x10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='2' port='0x11'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='3' port='0x12'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='4' port='0x13'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='5' port='0x14'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='6' port='0x15'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='7' port='0x16'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='8' port='0x17'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.8'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='9' port='0x18'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.9'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='10' port='0x19'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='11' port='0x1a'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.11'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='12' port='0x1b'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.12'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='13' port='0x1c'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.13'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='14' port='0x1d'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.14'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='15' port='0x1e'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.15'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='16' port='0x1f'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.16'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='17' port='0x20'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.17'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='18' port='0x21'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.18'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='19' port='0x22'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.19'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='20' port='0x23'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.20'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='21' port='0x24'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.21'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='22' port='0x25'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.22'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='23' port='0x26'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.23'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='24' port='0x27'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.24'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-root-port'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target chassis='25' port='0x28'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.25'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model name='pcie-pci-bridge'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='pci.26'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='usb'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <controller type='sata' index='0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='ide'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </controller>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <interface type='ethernet'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <mac address='fa:16:3e:0d:fa:32'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target dev='tap7a99c6a7-ba'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model type='virtio'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <driver name='vhost' rx_queue_size='512'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <mtu size='1442'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='net0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <serial type='pty'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target type='isa-serial' port='0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:        <model name='isa-serial'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      </target>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <console type='pty' tty='/dev/pts/0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <source path='/dev/pts/0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <log file='/var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4/console.log' append='off'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <target type='serial' port='0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='serial0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </console>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='tablet' bus='usb'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='usb' bus='0' port='1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='mouse' bus='ps2'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input1'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <input type='keyboard' bus='ps2'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='input2'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </input>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <listen type='address' address='::0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </graphics>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <audio id='1' type='none'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <model type='virtio' heads='1' primary='yes'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='video0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <watchdog model='itco' action='reset'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='watchdog0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </watchdog>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <memballoon model='virtio'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <stats period='10'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='balloon0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <rng model='virtio'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <backend model='random'>/dev/urandom</backend>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <alias name='rng0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <label>system_u:system_r:svirt_t:s0:c56,c432</label>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c56,c432</imagelabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <label>+107:+107</label>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <imagelabel>+107:+107</imagelabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </seclabel>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.374 186962 WARNING nova.virt.libvirt.driver [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Detaching interface fa:16:3e:3b:cc:f7 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapf8129917-82' not found.#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.375 186962 DEBUG nova.virt.libvirt.vif [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.375 186962 DEBUG nova.network.os_vif_util [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converting VIF {"id": "f8129917-8284-46b6-9714-dc84a6cb4b04", "address": "fa:16:3e:3b:cc:f7", "network": {"id": "6f1be974-bcaa-4b93-ab01-8adab0060f10", "bridge": "br-int", "label": "tempest-network-smoke--1149019234", "subnets": [{"cidr": "10.100.0.32/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.46", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8129917-82", "ovs_interfaceid": "f8129917-8284-46b6-9714-dc84a6cb4b04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.376 186962 DEBUG nova.network.os_vif_util [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.376 186962 DEBUG os_vif [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.377 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.378 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8129917-82, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.378 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.379 186962 INFO os_vif [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:cc:f7,bridge_name='br-int',has_traffic_filtering=True,id=f8129917-8284-46b6-9714-dc84a6cb4b04,network=Network(6f1be974-bcaa-4b93-ab01-8adab0060f10),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8129917-82')#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.380 186962 DEBUG nova.virt.libvirt.guest [req-039aca36-703d-4b21-a64d-6b5a5799e89b req-b790d75b-9179-4b7a-8aed-e51fc4d31336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:name>tempest-TestNetworkBasicOps-server-2110898837</nova:name>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:creationTime>2025-11-29 07:44:12</nova:creationTime>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:flavor name="m1.nano">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:memory>128</nova:memory>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:disk>1</nova:disk>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:swap>0</nova:swap>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:vcpus>1</nova:vcpus>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:flavor>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:owner>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  <nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    <nova:port uuid="7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d">
Nov 29 02:44:12 np0005539505 nova_compute[186958]:      <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:    </nova:port>
Nov 29 02:44:12 np0005539505 nova_compute[186958]:  </nova:ports>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: </nova:instance>
Nov 29 02:44:12 np0005539505 nova_compute[186958]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.745 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:12 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:12.851 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:12 np0005539505 nova_compute[186958]: 2025-11-29 07:44:12.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:13Z|00763|binding|INFO|Releasing lport 3e62b79d-3423-4f59-a770-7cbfebfe062a from this chassis (sb_readonly=0)
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.123 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.283 186962 INFO nova.network.neutron [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Port f8129917-8284-46b6-9714-dc84a6cb4b04 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.284 186962 DEBUG nova.network.neutron [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.299 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.331 186962 DEBUG oslo_concurrency.lockutils [None req-c365d45a-6984-4484-b09e-6575b37bba6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "interface-1054c168-50b7-42e4-aedb-6ddca8a197a4-f8129917-8284-46b6-9714-dc84a6cb4b04" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:13 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873-userdata-shm.mount: Deactivated successfully.
Nov 29 02:44:13 np0005539505 systemd[1]: var-lib-containers-storage-overlay-ec088acc589728e063b7da2c3bb90c76d29b7e4aa4df3d6b39e9ab55f2fa2acf-merged.mount: Deactivated successfully.
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.669 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.669 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.670 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.670 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.670 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:13 np0005539505 podman[248570]: 2025-11-29 07:44:13.672611746 +0000 UTC m=+1.966899463 container cleanup 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.683 186962 INFO nova.compute.manager [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Terminating instance#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.696 186962 DEBUG nova.compute.manager [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:44:13 np0005539505 systemd[1]: libpod-conmon-6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873.scope: Deactivated successfully.
Nov 29 02:44:13 np0005539505 kernel: tap7a99c6a7-ba (unregistering): left promiscuous mode
Nov 29 02:44:13 np0005539505 NetworkManager[55134]: <info>  [1764402253.7239] device (tap7a99c6a7-ba): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:44:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:13Z|00764|binding|INFO|Releasing lport 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d from this chassis (sb_readonly=0)
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.731 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:13Z|00765|binding|INFO|Setting lport 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d down in Southbound
Nov 29 02:44:13 np0005539505 ovn_controller[95143]: 2025-11-29T07:44:13Z|00766|binding|INFO|Removing iface tap7a99c6a7-ba ovn-installed in OVS
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.740 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:fa:32 10.100.0.12'], port_security=['fa:16:3e:0d:fa:32 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '1054c168-50b7-42e4-aedb-6ddca8a197a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-699261cc-4df3-4556-934e-8ab5d6d3f144', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4dc7eec-7c17-458c-a1f4-6b636722c8d4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dd9dc223-0c14-4e63-a38e-bc90457ac64b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.749 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009f.scope: Deactivated successfully.
Nov 29 02:44:13 np0005539505 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d0000009f.scope: Consumed 21.620s CPU time.
Nov 29 02:44:13 np0005539505 systemd-machined[153285]: Machine qemu-79-instance-0000009f terminated.
Nov 29 02:44:13 np0005539505 podman[248599]: 2025-11-29 07:44:13.883918064 +0000 UTC m=+0.191198522 container remove 6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.890 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc956e3-b240-401d-abbf-22b3678d8f96]: (4, ('Sat Nov 29 07:44:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 (6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873)\n6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873\nSat Nov 29 07:44:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 (6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873)\n6b55ba26da66e45b0760dfec20739177409ce96f652d5dbef812ce078c43e873\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.892 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e03e63ee-675d-4b7a-865e-ac67e7d969f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.893 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f1be974-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 kernel: tap6f1be974-b0: left promiscuous mode
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.900 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[20660c7b-54c8-4ebe-a39b-5d450e7b4d4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.917 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dee25588-5d88-40e8-9f50-989b3102868e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.918 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bdee4392-0e85-43f0-9d6b-11a2093a3af6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.934 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3689af4c-3d7a-4bdd-9d5f-b252d7ee3673]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 759231, 'reachable_time': 17372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248629, 'error': None, 'target': 'ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.936 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6f1be974-bcaa-4b93-ab01-8adab0060f10 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.936 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[7adabc54-f37a-4824-a838-faaafd26c5a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.937 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:44:13 np0005539505 systemd[1]: run-netns-ovnmeta\x2d6f1be974\x2dbcaa\x2d4b93\x2dab01\x2d8adab0060f10.mount: Deactivated successfully.
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.938 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d in datapath 699261cc-4df3-4556-934e-8ab5d6d3f144 unbound from our chassis#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.939 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 699261cc-4df3-4556-934e-8ab5d6d3f144, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.940 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd442a8-986b-4f85-aefb-464721474efa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:13 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:13.941 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144 namespace which is not needed anymore#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.961 186962 INFO nova.virt.libvirt.driver [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Instance destroyed successfully.#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.961 186962 DEBUG nova.objects.instance [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 1054c168-50b7-42e4-aedb-6ddca8a197a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.980 186962 DEBUG nova.virt.libvirt.vif [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:41:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2110898837',display_name='tempest-TestNetworkBasicOps-server-2110898837',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2110898837',id=159,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM7UKFuj7QMrD2gwvRJwF1C5ODafQoalw3wt4tl3CWp1E3Ov5Wq4NHZjxUJq9RlkJeCJZQ+vSdDu4+Tn4UobItS+wK5vemrE3fQE4FqHuLj0BbgXviF1Wn0sGulLRgX6cA==',key_name='tempest-TestNetworkBasicOps-1286988878',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:41:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-yxnrze2m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:41:27Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1054c168-50b7-42e4-aedb-6ddca8a197a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.981 186962 DEBUG nova.network.os_vif_util [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.981 186962 DEBUG nova.network.os_vif_util [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.982 186962 DEBUG os_vif [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.983 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.983 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a99c6a7-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.984 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.985 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.987 186962 INFO os_vif [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:fa:32,bridge_name='br-int',has_traffic_filtering=True,id=7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d,network=Network(699261cc-4df3-4556-934e-8ab5d6d3f144),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a99c6a7-ba')#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.988 186962 INFO nova.virt.libvirt.driver [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Deleting instance files /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4_del#033[00m
Nov 29 02:44:13 np0005539505 nova_compute[186958]: 2025-11-29 07:44:13.988 186962 INFO nova.virt.libvirt.driver [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Deletion of /var/lib/nova/instances/1054c168-50b7-42e4-aedb-6ddca8a197a4_del complete#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.016 186962 DEBUG nova.compute.manager [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.016 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.017 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.017 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.017 186962 DEBUG nova.compute.manager [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 WARNING nova.compute.manager [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-plugged-f8129917-8284-46b6-9714-dc84a6cb4b04 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 DEBUG nova.compute.manager [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 DEBUG nova.compute.manager [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing instance network info cache due to event network-changed-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.018 186962 DEBUG nova.network.neutron [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Refreshing network info cache for port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:44:14 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [NOTICE]   (247587) : haproxy version is 2.8.14-c23fe91
Nov 29 02:44:14 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [NOTICE]   (247587) : path to executable is /usr/sbin/haproxy
Nov 29 02:44:14 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [WARNING]  (247587) : Exiting Master process...
Nov 29 02:44:14 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [ALERT]    (247587) : Current worker (247598) exited with code 143 (Terminated)
Nov 29 02:44:14 np0005539505 neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144[247552]: [WARNING]  (247587) : All workers exited. Exiting... (0)
Nov 29 02:44:14 np0005539505 systemd[1]: libpod-204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d.scope: Deactivated successfully.
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.074 186962 INFO nova.compute.manager [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.075 186962 DEBUG oslo.service.loopingcall [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.076 186962 DEBUG nova.compute.manager [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.076 186962 DEBUG nova.network.neutron [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:44:14 np0005539505 podman[248655]: 2025-11-29 07:44:14.081487109 +0000 UTC m=+0.061823964 container died 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:44:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d-userdata-shm.mount: Deactivated successfully.
Nov 29 02:44:14 np0005539505 systemd[1]: var-lib-containers-storage-overlay-bc5284d99314774fb41c7ff8fedba3829dc8a84836e22d985964efdb503197c4-merged.mount: Deactivated successfully.
Nov 29 02:44:14 np0005539505 podman[248655]: 2025-11-29 07:44:14.189939728 +0000 UTC m=+0.170276573 container cleanup 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 02:44:14 np0005539505 systemd[1]: libpod-conmon-204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d.scope: Deactivated successfully.
Nov 29 02:44:14 np0005539505 podman[248687]: 2025-11-29 07:44:14.252803941 +0000 UTC m=+0.043865869 container remove 204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.258 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6fb3f1-d651-4f32-9215-ce89568b4c47]: (4, ('Sat Nov 29 07:44:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144 (204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d)\n204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d\nSat Nov 29 07:44:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144 (204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d)\n204b64692429b6da56d0c1c3f7c43c054ab9f149bf774459d52c69001576932d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.259 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48d60d3f-4083-4bb7-95d0-b757c90e76a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.260 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap699261cc-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:14 np0005539505 kernel: tap699261cc-40: left promiscuous mode
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.262 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.274 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.275 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8eb17d3-7cd8-40e2-b182-19e8be9651b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.290 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b515276e-606a-46fa-a42d-1cebe7bd42db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.292 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e7e936-244f-4660-86a4-771edbdeff91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.307 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cd42a70b-4087-4f83-b7e4-79bb47f63603]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 753447, 'reachable_time': 35514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248701, 'error': None, 'target': 'ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.309 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-699261cc-4df3-4556-934e-8ab5d6d3f144 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:44:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:14.309 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4ce25305-3124-4972-8ab3-15cba3e7a61e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:44:14 np0005539505 systemd[1]: run-netns-ovnmeta\x2d699261cc\x2d4df3\x2d4556\x2d934e\x2d8ab5d6d3f144.mount: Deactivated successfully.
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.911 186962 DEBUG nova.network.neutron [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.928 186962 INFO nova.compute.manager [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Took 0.85 seconds to deallocate network for instance.#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.997 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:14 np0005539505 nova_compute[186958]: 2025-11-29 07:44:14.998 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.047 186962 DEBUG nova.compute.provider_tree [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.063 186962 DEBUG nova.scheduler.client.report [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.086 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.107 186962 INFO nova.scheduler.client.report [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 1054c168-50b7-42e4-aedb-6ddca8a197a4#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.229 186962 DEBUG oslo_concurrency.lockutils [None req-02e6bb06-a2f9-4f0b-aa66-95a2cc34d46b 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.560s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.358 186962 DEBUG nova.network.neutron [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updated VIF entry in instance network info cache for port 7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.358 186962 DEBUG nova.network.neutron [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Updating instance_info_cache with network_info: [{"id": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "address": "fa:16:3e:0d:fa:32", "network": {"id": "699261cc-4df3-4556-934e-8ab5d6d3f144", "bridge": "br-int", "label": "tempest-network-smoke--1002143632", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a99c6a7-ba", "ovs_interfaceid": "7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:44:15 np0005539505 nova_compute[186958]: 2025-11-29 07:44:15.382 186962 DEBUG oslo_concurrency.lockutils [req-39aed80a-9705-4997-97e8-2fbf1395f55e req-5b60cd7c-2265-4d24-ae85-c1c8be81403b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1054c168-50b7-42e4-aedb-6ddca8a197a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.113 186962 DEBUG nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-unplugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.113 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.114 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.114 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.114 186962 DEBUG nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-unplugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.115 186962 WARNING nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-unplugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.115 186962 DEBUG nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.115 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.116 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.116 186962 DEBUG oslo_concurrency.lockutils [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1054c168-50b7-42e4-aedb-6ddca8a197a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.116 186962 DEBUG nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] No waiting events found dispatching network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.116 186962 WARNING nova.compute.manager [req-1b7fca79-c6f9-4bb9-a5ff-bd4c95465bbf req-ecd684e4-bad1-4fa7-866e-12258586d702 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received unexpected event network-vif-plugged-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:44:16 np0005539505 nova_compute[186958]: 2025-11-29 07:44:16.588 186962 DEBUG nova.compute.manager [req-7de6e117-d859-4ada-968b-5aae5fe45b67 req-dfdbd017-c218-4cc4-acf9-0984006293f1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Received event network-vif-deleted-7a99c6a7-ba75-451c-89bf-a0fdf1e52f0d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:17 np0005539505 nova_compute[186958]: 2025-11-29 07:44:17.747 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:18 np0005539505 nova_compute[186958]: 2025-11-29 07:44:18.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:19 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:19.939 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:44:20 np0005539505 nova_compute[186958]: 2025-11-29 07:44:20.758 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:20 np0005539505 nova_compute[186958]: 2025-11-29 07:44:20.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:21 np0005539505 podman[248704]: 2025-11-29 07:44:21.72504324 +0000 UTC m=+0.055784800 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:44:21 np0005539505 podman[248703]: 2025-11-29 07:44:21.738084114 +0000 UTC m=+0.070366559 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:44:22 np0005539505 nova_compute[186958]: 2025-11-29 07:44:22.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:23 np0005539505 nova_compute[186958]: 2025-11-29 07:44:23.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:25 np0005539505 nova_compute[186958]: 2025-11-29 07:44:25.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:25 np0005539505 podman[248751]: 2025-11-29 07:44:25.720470248 +0000 UTC m=+0.053882154 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 29 02:44:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:27.527 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:27.528 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:44:27.528 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:27 np0005539505 nova_compute[186958]: 2025-11-29 07:44:27.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:28 np0005539505 nova_compute[186958]: 2025-11-29 07:44:28.961 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402253.9594045, 1054c168-50b7-42e4-aedb-6ddca8a197a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:44:28 np0005539505 nova_compute[186958]: 2025-11-29 07:44:28.961 186962 INFO nova.compute.manager [-] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:44:28 np0005539505 nova_compute[186958]: 2025-11-29 07:44:28.994 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:30 np0005539505 nova_compute[186958]: 2025-11-29 07:44:30.038 186962 DEBUG nova.compute.manager [None req-f6371615-8ed9-40e3-85b7-2b7d8c5a86c7 - - - - - -] [instance: 1054c168-50b7-42e4-aedb-6ddca8a197a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:44:32 np0005539505 nova_compute[186958]: 2025-11-29 07:44:32.804 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:33 np0005539505 podman[248770]: 2025-11-29 07:44:33.717114092 +0000 UTC m=+0.050483058 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:44:33 np0005539505 podman[248771]: 2025-11-29 07:44:33.751882899 +0000 UTC m=+0.080727465 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:44:33 np0005539505 nova_compute[186958]: 2025-11-29 07:44:33.997 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:34 np0005539505 nova_compute[186958]: 2025-11-29 07:44:34.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:34 np0005539505 nova_compute[186958]: 2025-11-29 07:44:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:34 np0005539505 nova_compute[186958]: 2025-11-29 07:44:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:34 np0005539505 nova_compute[186958]: 2025-11-29 07:44:34.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.409 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.409 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:37 np0005539505 podman[248819]: 2025-11-29 07:44:37.725039239 +0000 UTC m=+0.053214537 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:44:37 np0005539505 podman[248818]: 2025-11-29 07:44:37.745464195 +0000 UTC m=+0.078990986 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Nov 29 02:44:37 np0005539505 nova_compute[186958]: 2025-11-29 07:44:37.807 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:39 np0005539505 nova_compute[186958]: 2025-11-29 07:44:38.999 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:44:40 np0005539505 nova_compute[186958]: 2025-11-29 07:44:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.438 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.439 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.439 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.439 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.595 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.597 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5729MB free_disk=73.07309341430664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.598 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.598 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.755 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.756 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.834 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.868 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.891 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:44:42 np0005539505 nova_compute[186958]: 2025-11-29 07:44:42.892 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:44:44 np0005539505 nova_compute[186958]: 2025-11-29 07:44:44.002 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:46 np0005539505 nova_compute[186958]: 2025-11-29 07:44:46.888 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:44:47 np0005539505 nova_compute[186958]: 2025-11-29 07:44:47.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:44:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:44:48 np0005539505 nova_compute[186958]: 2025-11-29 07:44:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:44:49 np0005539505 nova_compute[186958]: 2025-11-29 07:44:49.005 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:52 np0005539505 podman[248862]: 2025-11-29 07:44:52.714606201 +0000 UTC m=+0.048947264 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:44:52 np0005539505 podman[248861]: 2025-11-29 07:44:52.724988679 +0000 UTC m=+0.059082795 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 02:44:52 np0005539505 nova_compute[186958]: 2025-11-29 07:44:52.811 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:54 np0005539505 nova_compute[186958]: 2025-11-29 07:44:54.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:56 np0005539505 podman[248905]: 2025-11-29 07:44:56.749923094 +0000 UTC m=+0.075719962 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.270 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.270 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.289 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.390 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.390 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.398 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.398 186962 INFO nova.compute.claims [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Claim successful on node compute-2.ctlplane.example.com
Nov 29 02:44:57 np0005539505 nova_compute[186958]: 2025-11-29 07:44:57.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.122 186962 DEBUG nova.compute.provider_tree [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.141 186962 DEBUG nova.scheduler.client.report [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.167 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.777s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.168 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.243 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.244 186962 DEBUG nova.network.neutron [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.264 186962 INFO nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.289 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.445 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.447 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.447 186962 INFO nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Creating image(s)
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.448 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.448 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.448 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.460 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.534 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.535 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.536 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.548 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.599 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.600 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:44:58 np0005539505 nova_compute[186958]: 2025-11-29 07:44:58.779 186962 DEBUG nova.policy [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.052 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk 1073741824" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.054 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.518s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.055 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.143 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.145 186962 DEBUG nova.virt.disk.api [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.146 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.243 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.244 186962 DEBUG nova.virt.disk.api [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.245 186962 DEBUG nova.objects.instance [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1186f0c5-bea2-440e-a805-9b10c97141d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.260 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.261 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Ensure instance console log exists: /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.261 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.262 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.262 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.826 186962 DEBUG nova.network.neutron [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Successfully updated port: 2d4d5100-ecb5-4399-baa7-6c217e5618d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.847 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.847 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.847 186962 DEBUG nova.network.neutron [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.940 186962 DEBUG nova.compute.manager [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.940 186962 DEBUG nova.compute.manager [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Refreshing instance network info cache due to event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:44:59 np0005539505 nova_compute[186958]: 2025-11-29 07:44:59.940 186962 DEBUG oslo_concurrency.lockutils [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:45:00 np0005539505 nova_compute[186958]: 2025-11-29 07:45:00.029 186962 DEBUG nova.network.neutron [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.493 186962 DEBUG nova.network.neutron [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updating instance_info_cache with network_info: [{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.873 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.873 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Instance network_info: |[{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.874 186962 DEBUG oslo_concurrency.lockutils [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.874 186962 DEBUG nova.network.neutron [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Refreshing network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.880 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Start _get_guest_xml network_info=[{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.884 186962 WARNING nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.889 186962 DEBUG nova.virt.libvirt.host [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.890 186962 DEBUG nova.virt.libvirt.host [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.892 186962 DEBUG nova.virt.libvirt.host [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.893 186962 DEBUG nova.virt.libvirt.host [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.894 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.895 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.895 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.895 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.895 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.896 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.896 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.896 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.896 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.897 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.897 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.897 186962 DEBUG nova.virt.hardware [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.901 186962 DEBUG nova.virt.libvirt.vif [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1015188603',display_name='tempest-TestNetworkBasicOps-server-1015188603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1015188603',id=164,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAhseXUdjfAvLqKD1zIa0pPpJoGcUMy89F99veWf+LlOCe0nfMtz7vPl1fK/1GFr4CHeyQygvqqJSK0C3yi8SBKl9DywNFZI7M4aRAiTi396ssrwSMEKhQtO0VY/qblnvw==',key_name='tempest-TestNetworkBasicOps-1251454727',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-m3w63sjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:44:58Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1186f0c5-bea2-440e-a805-9b10c97141d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.901 186962 DEBUG nova.network.os_vif_util [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.902 186962 DEBUG nova.network.os_vif_util [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.902 186962 DEBUG nova.objects.instance [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1186f0c5-bea2-440e-a805-9b10c97141d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.946 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <uuid>1186f0c5-bea2-440e-a805-9b10c97141d5</uuid>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <name>instance-000000a4</name>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkBasicOps-server-1015188603</nova:name>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:45:01</nova:creationTime>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        <nova:port uuid="2d4d5100-ecb5-4399-baa7-6c217e5618d4">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="serial">1186f0c5-bea2-440e-a805-9b10c97141d5</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="uuid">1186f0c5-bea2-440e-a805-9b10c97141d5</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.config"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:5e:0e:8a"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <target dev="tap2d4d5100-ec"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/console.log" append="off"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:45:01 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:45:01 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:45:01 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:45:01 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.947 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Preparing to wait for external event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.948 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.948 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.948 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.949 186962 DEBUG nova.virt.libvirt.vif [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1015188603',display_name='tempest-TestNetworkBasicOps-server-1015188603',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1015188603',id=164,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAhseXUdjfAvLqKD1zIa0pPpJoGcUMy89F99veWf+LlOCe0nfMtz7vPl1fK/1GFr4CHeyQygvqqJSK0C3yi8SBKl9DywNFZI7M4aRAiTi396ssrwSMEKhQtO0VY/qblnvw==',key_name='tempest-TestNetworkBasicOps-1251454727',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-m3w63sjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:44:58Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1186f0c5-bea2-440e-a805-9b10c97141d5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.949 186962 DEBUG nova.network.os_vif_util [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.950 186962 DEBUG nova.network.os_vif_util [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.950 186962 DEBUG os_vif [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.951 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.951 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.952 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.955 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.955 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d4d5100-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.955 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d4d5100-ec, col_values=(('external_ids', {'iface-id': '2d4d5100-ecb5-4399-baa7-6c217e5618d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:0e:8a', 'vm-uuid': '1186f0c5-bea2-440e-a805-9b10c97141d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.957 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:01 np0005539505 NetworkManager[55134]: <info>  [1764402301.9579] manager: (tap2d4d5100-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.962 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:01 np0005539505 nova_compute[186958]: 2025-11-29 07:45:01.963 186962 INFO os_vif [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec')
Nov 29 02:45:02 np0005539505 nova_compute[186958]: 2025-11-29 07:45:02.127 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:45:02 np0005539505 nova_compute[186958]: 2025-11-29 07:45:02.128 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 29 02:45:02 np0005539505 nova_compute[186958]: 2025-11-29 07:45:02.128 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:5e:0e:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 29 02:45:02 np0005539505 nova_compute[186958]: 2025-11-29 07:45:02.130 186962 INFO nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Using config drive
Nov 29 02:45:02 np0005539505 nova_compute[186958]: 2025-11-29 07:45:02.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.638 186962 INFO nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Creating config drive at /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.config
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.648 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3vr_ll3b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 29 02:45:04 np0005539505 podman[248941]: 2025-11-29 07:45:04.741845142 +0000 UTC m=+0.074818496 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:45:04 np0005539505 podman[248943]: 2025-11-29 07:45:04.764010968 +0000 UTC m=+0.090700541 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.791 186962 DEBUG oslo_concurrency.processutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3vr_ll3b" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 29 02:45:04 np0005539505 kernel: tap2d4d5100-ec: entered promiscuous mode
Nov 29 02:45:04 np0005539505 NetworkManager[55134]: <info>  [1764402304.8570] manager: (tap2d4d5100-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.857 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:04Z|00767|binding|INFO|Claiming lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 for this chassis.
Nov 29 02:45:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:04Z|00768|binding|INFO|2d4d5100-ecb5-4399-baa7-6c217e5618d4: Claiming fa:16:3e:5e:0e:8a 10.100.0.9
Nov 29 02:45:04 np0005539505 systemd-udevd[249010]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:45:04 np0005539505 systemd-machined[153285]: New machine qemu-80-instance-000000a4.
Nov 29 02:45:04 np0005539505 NetworkManager[55134]: <info>  [1764402304.9031] device (tap2d4d5100-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:45:04 np0005539505 NetworkManager[55134]: <info>  [1764402304.9046] device (tap2d4d5100-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:45:04 np0005539505 systemd[1]: Started Virtual Machine qemu-80-instance-000000a4.
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.941 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:04 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:04Z|00769|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 ovn-installed in OVS
Nov 29 02:45:04 np0005539505 nova_compute[186958]: 2025-11-29 07:45:04.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.232 186962 DEBUG nova.network.neutron [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updated VIF entry in instance network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.233 186962 DEBUG nova.network.neutron [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updating instance_info_cache with network_info: [{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:45:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:05Z|00770|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 up in Southbound
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.279 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:0e:8a 10.100.0.9'], port_security=['fa:16:3e:5e:0e:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1186f0c5-bea2-440e-a805-9b10c97141d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45898937-6ca6-4da8-a84a-53586a0780e5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77dd2b4e-807c-4149-a535-57e12b7bc161, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2d4d5100-ecb5-4399-baa7-6c217e5618d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.280 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 in datapath 45898937-6ca6-4da8-a84a-53586a0780e5 bound to our chassis
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.282 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 45898937-6ca6-4da8-a84a-53586a0780e5
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.295 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e77a9d-f9d1-4ad2-b653-53b0952946c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.296 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap45898937-61 in ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.300 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap45898937-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.300 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe7b2d8-17c5-4329-8464-d407ae71720a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.302 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8781dd-8319-496e-9f78-576a125da89f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.306 186962 DEBUG oslo_concurrency.lockutils [req-b6010502-5c4c-42cb-8376-405af9b2986d req-18b9e5d4-70a7-48f8-b5f1-413d515146df 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.314 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe9965b-4d6e-47e1-824d-3c63c95b8fd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.343 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec62c88-7873-4b03-b88b-525416902773]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.377 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c8c847-dd6a-4499-9d77-17f7f5e0f969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 NetworkManager[55134]: <info>  [1764402305.3852] manager: (tap45898937-60): new Veth device (/org/freedesktop/NetworkManager/Devices/376)
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.387 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fa90b3d8-a900-492f-9c98-a3b808e646b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.417 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[ff0515a7-e582-45a7-839f-0397faf302ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.421 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe28262-f5d7-4fd1-baa4-730874d858e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 NetworkManager[55134]: <info>  [1764402305.4419] device (tap45898937-60): carrier: link connected
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.447 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[4b94e169-9091-445a-85c8-462a911ed891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.466 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c9b65444-430e-4097-a3c0-3edfac209384]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45898937-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:26:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775302, 'reachable_time': 32499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249046, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.480 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fad39887-5e63-434b-844d-c9b3a1dad9bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:2607'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 775302, 'tstamp': 775302}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249047, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.498 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[030455f5-0a63-4b23-93ec-2ccc09a7db3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45898937-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:26:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 238], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775302, 'reachable_time': 32499, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249048, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.530 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3745a8c6-3eb6-4e07-bb24-713ef747c24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.563 186962 DEBUG nova.compute.manager [req-f3918a57-de47-43f2-8dac-28f8feb073fb req-4611579d-be43-4fb8-a598-a9d70475957e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.564 186962 DEBUG oslo_concurrency.lockutils [req-f3918a57-de47-43f2-8dac-28f8feb073fb req-4611579d-be43-4fb8-a598-a9d70475957e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.565 186962 DEBUG oslo_concurrency.lockutils [req-f3918a57-de47-43f2-8dac-28f8feb073fb req-4611579d-be43-4fb8-a598-a9d70475957e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.565 186962 DEBUG oslo_concurrency.lockutils [req-f3918a57-de47-43f2-8dac-28f8feb073fb req-4611579d-be43-4fb8-a598-a9d70475957e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.566 186962 DEBUG nova.compute.manager [req-f3918a57-de47-43f2-8dac-28f8feb073fb req-4611579d-be43-4fb8-a598-a9d70475957e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Processing event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.616 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[27482821-4560-467f-a659-8e62b55747da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.617 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45898937-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.617 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.618 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45898937-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:05 np0005539505 NetworkManager[55134]: <info>  [1764402305.6201] manager: (tap45898937-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Nov 29 02:45:05 np0005539505 kernel: tap45898937-60: entered promiscuous mode
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.623 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.624 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap45898937-60, col_values=(('external_ids', {'iface-id': 'fca7ed1d-5327-4a82-9ef6-10233b98308b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:05 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:05Z|00771|binding|INFO|Releasing lport fca7ed1d-5327-4a82-9ef6-10233b98308b from this chassis (sb_readonly=0)
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.626 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402305.625987, 1186f0c5-bea2-440e-a805-9b10c97141d5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.627 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] VM Started (Lifecycle Event)#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.629 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.633 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.637 186962 INFO nova.virt.libvirt.driver [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Instance spawned successfully.#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.637 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:45:05 np0005539505 nova_compute[186958]: 2025-11-29 07:45:05.648 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.650 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.651 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e60a68e7-d98c-4088-8479-0381fdd61139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.652 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-45898937-6ca6-4da8-a84a-53586a0780e5
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 45898937-6ca6-4da8-a84a-53586a0780e5
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:45:05 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:05.652 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'env', 'PROCESS_TAG=haproxy-45898937-6ca6-4da8-a84a-53586a0780e5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/45898937-6ca6-4da8-a84a-53586a0780e5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:45:06 np0005539505 podman[249086]: 2025-11-29 07:45:06.009507105 +0000 UTC m=+0.023103543 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:45:06 np0005539505 podman[249086]: 2025-11-29 07:45:06.126913571 +0000 UTC m=+0.140509989 container create cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:45:06 np0005539505 systemd[1]: Started libpod-conmon-cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f.scope.
Nov 29 02:45:06 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:45:06 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a76eab6b570e875fdd44f9ddfe39df4e9e5ac688b3b64a512301a61be66f5fe0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:45:06 np0005539505 podman[249086]: 2025-11-29 07:45:06.282609435 +0000 UTC m=+0.296205873 container init cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 29 02:45:06 np0005539505 podman[249086]: 2025-11-29 07:45:06.29288517 +0000 UTC m=+0.306481628 container start cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:45:06 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [NOTICE]   (249106) : New worker (249108) forked
Nov 29 02:45:06 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [NOTICE]   (249106) : Loading success.
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.676 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.681 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.681 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.682 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.682 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.682 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.683 186962 DEBUG nova.virt.libvirt.driver [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.686 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.714 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.714 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402305.6261196, 1186f0c5-bea2-440e-a805-9b10c97141d5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.714 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.956 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.961 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402305.6318383, 1186f0c5-bea2-440e-a805-9b10c97141d5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:06 np0005539505 nova_compute[186958]: 2025-11-29 07:45:06.961 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.249 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.253 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.346 186962 INFO nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Took 8.90 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.347 186962 DEBUG nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.645 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:45:07 np0005539505 nova_compute[186958]: 2025-11-29 07:45:07.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:08 np0005539505 nova_compute[186958]: 2025-11-29 07:45:08.628 186962 INFO nova.compute.manager [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Took 11.28 seconds to build instance.#033[00m
Nov 29 02:45:08 np0005539505 podman[249118]: 2025-11-29 07:45:08.745200018 +0000 UTC m=+0.075037473 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:45:08 np0005539505 podman[249117]: 2025-11-29 07:45:08.74633438 +0000 UTC m=+0.079004036 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.200 186962 DEBUG oslo_concurrency.lockutils [None req-b964840e-e9d4-42a8-8575-d1ba84981e6d 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.959 186962 DEBUG nova.compute.manager [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.959 186962 DEBUG oslo_concurrency.lockutils [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.960 186962 DEBUG oslo_concurrency.lockutils [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.960 186962 DEBUG oslo_concurrency.lockutils [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.960 186962 DEBUG nova.compute.manager [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] No waiting events found dispatching network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:45:09 np0005539505 nova_compute[186958]: 2025-11-29 07:45:09.961 186962 WARNING nova.compute.manager [req-afa407d8-20bd-4c6f-b49f-3e42d901bc82 req-97b34eff-2c99-4e8f-841b-c44866c8f2d0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received unexpected event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:45:11 np0005539505 nova_compute[186958]: 2025-11-29 07:45:11.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:12 np0005539505 nova_compute[186958]: 2025-11-29 07:45:12.857 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:14.443 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.443 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:14.446 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.506 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539505 NetworkManager[55134]: <info>  [1764402314.5076] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/378)
Nov 29 02:45:14 np0005539505 NetworkManager[55134]: <info>  [1764402314.5095] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:14Z|00772|binding|INFO|Releasing lport fca7ed1d-5327-4a82-9ef6-10233b98308b from this chassis (sb_readonly=0)
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.647 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.976 186962 DEBUG nova.compute.manager [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.977 186962 DEBUG nova.compute.manager [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Refreshing instance network info cache due to event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.977 186962 DEBUG oslo_concurrency.lockutils [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.978 186962 DEBUG oslo_concurrency.lockutils [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:45:14 np0005539505 nova_compute[186958]: 2025-11-29 07:45:14.978 186962 DEBUG nova.network.neutron [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Refreshing network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.167 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.168 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.168 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.168 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.169 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.184 186962 INFO nova.compute.manager [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Terminating instance#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.195 186962 DEBUG nova.compute.manager [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:45:15 np0005539505 kernel: tap2d4d5100-ec (unregistering): left promiscuous mode
Nov 29 02:45:15 np0005539505 NetworkManager[55134]: <info>  [1764402315.2220] device (tap2d4d5100-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.230 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:15Z|00773|binding|INFO|Releasing lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 from this chassis (sb_readonly=0)
Nov 29 02:45:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:15Z|00774|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 down in Southbound
Nov 29 02:45:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:15Z|00775|binding|INFO|Removing iface tap2d4d5100-ec ovn-installed in OVS
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.248 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:15.251 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:0e:8a 10.100.0.9'], port_security=['fa:16:3e:5e:0e:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '1186f0c5-bea2-440e-a805-9b10c97141d5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45898937-6ca6-4da8-a84a-53586a0780e5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77dd2b4e-807c-4149-a535-57e12b7bc161, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2d4d5100-ecb5-4399-baa7-6c217e5618d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:15.252 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 in datapath 45898937-6ca6-4da8-a84a-53586a0780e5 unbound from our chassis#033[00m
Nov 29 02:45:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:15.254 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45898937-6ca6-4da8-a84a-53586a0780e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:45:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:15.255 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[43ba2b16-6e26-4215-b05d-6a2fa9932b0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:15 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:15.255 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 namespace which is not needed anymore#033[00m
Nov 29 02:45:15 np0005539505 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Nov 29 02:45:15 np0005539505 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a4.scope: Consumed 10.436s CPU time.
Nov 29 02:45:15 np0005539505 systemd-machined[153285]: Machine qemu-80-instance-000000a4 terminated.
Nov 29 02:45:15 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [NOTICE]   (249106) : haproxy version is 2.8.14-c23fe91
Nov 29 02:45:15 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [NOTICE]   (249106) : path to executable is /usr/sbin/haproxy
Nov 29 02:45:15 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [WARNING]  (249106) : Exiting Master process...
Nov 29 02:45:15 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [ALERT]    (249106) : Current worker (249108) exited with code 143 (Terminated)
Nov 29 02:45:15 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249102]: [WARNING]  (249106) : All workers exited. Exiting... (0)
Nov 29 02:45:15 np0005539505 systemd[1]: libpod-cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f.scope: Deactivated successfully.
Nov 29 02:45:15 np0005539505 podman[249176]: 2025-11-29 07:45:15.455359739 +0000 UTC m=+0.116711297 container died cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.470 186962 INFO nova.virt.libvirt.driver [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Instance destroyed successfully.#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.471 186962 DEBUG nova.objects.instance [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 1186f0c5-bea2-440e-a805-9b10c97141d5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.482 186962 DEBUG nova.virt.libvirt.vif [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:44:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1015188603',display_name='tempest-TestNetworkBasicOps-server-1015188603',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1015188603',id=164,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAhseXUdjfAvLqKD1zIa0pPpJoGcUMy89F99veWf+LlOCe0nfMtz7vPl1fK/1GFr4CHeyQygvqqJSK0C3yi8SBKl9DywNFZI7M4aRAiTi396ssrwSMEKhQtO0VY/qblnvw==',key_name='tempest-TestNetworkBasicOps-1251454727',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:45:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-m3w63sjf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:45:07Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=1186f0c5-bea2-440e-a805-9b10c97141d5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.483 186962 DEBUG nova.network.os_vif_util [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.484 186962 DEBUG nova.network.os_vif_util [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.484 186962 DEBUG os_vif [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.486 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.486 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d4d5100-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.489 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.492 186962 INFO os_vif [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec')#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.493 186962 INFO nova.virt.libvirt.driver [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Deleting instance files /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5_del#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.493 186962 INFO nova.virt.libvirt.driver [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Deletion of /var/lib/nova/instances/1186f0c5-bea2-440e-a805-9b10c97141d5_del complete#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.553 186962 INFO nova.compute.manager [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.553 186962 DEBUG oslo.service.loopingcall [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.554 186962 DEBUG nova.compute.manager [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.554 186962 DEBUG nova.network.neutron [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:45:15 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:45:15 np0005539505 systemd[1]: var-lib-containers-storage-overlay-a76eab6b570e875fdd44f9ddfe39df4e9e5ac688b3b64a512301a61be66f5fe0-merged.mount: Deactivated successfully.
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.808 186962 DEBUG nova.compute.manager [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.809 186962 DEBUG oslo_concurrency.lockutils [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.810 186962 DEBUG oslo_concurrency.lockutils [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.810 186962 DEBUG oslo_concurrency.lockutils [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.811 186962 DEBUG nova.compute.manager [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] No waiting events found dispatching network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:45:15 np0005539505 nova_compute[186958]: 2025-11-29 07:45:15.811 186962 DEBUG nova.compute.manager [req-ec684260-549a-4eb5-8ce3-fab09bae3908 req-025c1936-8452-4818-afcd-6bd28263f846 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:45:15 np0005539505 podman[249176]: 2025-11-29 07:45:15.840923743 +0000 UTC m=+0.502275291 container cleanup cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:45:15 np0005539505 systemd[1]: libpod-conmon-cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f.scope: Deactivated successfully.
Nov 29 02:45:16 np0005539505 podman[249222]: 2025-11-29 07:45:16.146792832 +0000 UTC m=+0.282132770 container remove cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.155 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5f172c75-175d-420b-891d-4a2061129cdc]: (4, ('Sat Nov 29 07:45:15 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 (cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f)\ncb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f\nSat Nov 29 07:45:15 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 (cb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f)\ncb64cdf9121f83a7c0f798cd6015a9387d3a36a01a2b65503288c3544963d09f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.157 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd122529-8f4b-4a32-9f77-72a2b5efe3a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.160 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45898937-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:16 np0005539505 nova_compute[186958]: 2025-11-29 07:45:16.164 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:16 np0005539505 kernel: tap45898937-60: left promiscuous mode
Nov 29 02:45:16 np0005539505 nova_compute[186958]: 2025-11-29 07:45:16.175 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.178 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5af92b00-b5a2-46a6-8944-95652db1292f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.190 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[05bb9024-4353-48cb-b15b-4c72587bd41f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.192 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2f20220c-07e1-4272-ab5d-584585622d54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.206 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[55554c5b-5262-45c6-983e-c75ed0a5c8eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 775295, 'reachable_time': 31242, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249237, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 systemd[1]: run-netns-ovnmeta\x2d45898937\x2d6ca6\x2d4da8\x2da84a\x2d53586a0780e5.mount: Deactivated successfully.
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.211 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:45:16 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:16.211 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee03c6b-e3ea-4470-8d8e-5b2653822648]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:16 np0005539505 nova_compute[186958]: 2025-11-29 07:45:16.872 186962 DEBUG nova.network.neutron [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updated VIF entry in instance network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:45:16 np0005539505 nova_compute[186958]: 2025-11-29 07:45:16.873 186962 DEBUG nova.network.neutron [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updating instance_info_cache with network_info: [{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:45:16 np0005539505 nova_compute[186958]: 2025-11-29 07:45:16.898 186962 DEBUG oslo_concurrency.lockutils [req-f78e73ac-4880-4bae-90db-a075327b982e req-337b1366-cdf3-467e-a87a-e59fe36821ce 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-1186f0c5-bea2-440e-a805-9b10c97141d5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.177 186962 DEBUG nova.network.neutron [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.196 186962 INFO nova.compute.manager [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Took 1.64 seconds to deallocate network for instance.#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.300 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.301 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.350 186962 DEBUG nova.compute.provider_tree [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.364 186962 DEBUG nova.scheduler.client.report [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.384 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.403 186962 INFO nova.scheduler.client.report [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 1186f0c5-bea2-440e-a805-9b10c97141d5#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.473 186962 DEBUG oslo_concurrency.lockutils [None req-a21393a3-8b43-4b9f-98b9-73fd853f8f54 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.859 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.923 186962 DEBUG nova.compute.manager [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.924 186962 DEBUG oslo_concurrency.lockutils [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.924 186962 DEBUG oslo_concurrency.lockutils [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.924 186962 DEBUG oslo_concurrency.lockutils [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "1186f0c5-bea2-440e-a805-9b10c97141d5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.925 186962 DEBUG nova.compute.manager [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] No waiting events found dispatching network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:45:17 np0005539505 nova_compute[186958]: 2025-11-29 07:45:17.925 186962 WARNING nova.compute.manager [req-aa753fbd-9d8b-4d5b-a019-a763385c4048 req-31ba3259-d130-4fb4-a2af-2b03399060e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Received unexpected event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:45:19 np0005539505 nova_compute[186958]: 2025-11-29 07:45:19.771 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:20.448 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:20 np0005539505 nova_compute[186958]: 2025-11-29 07:45:20.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:22 np0005539505 nova_compute[186958]: 2025-11-29 07:45:22.900 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:23 np0005539505 podman[249239]: 2025-11-29 07:45:23.733269716 +0000 UTC m=+0.057904521 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:45:23 np0005539505 podman[249238]: 2025-11-29 07:45:23.766499389 +0000 UTC m=+0.095003815 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 29 02:45:25 np0005539505 nova_compute[186958]: 2025-11-29 07:45:25.389 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:25 np0005539505 nova_compute[186958]: 2025-11-29 07:45:25.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.101 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2 2001:db8::f816:3eff:fee1:bd9f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee1:bd9f/64', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=045a9acc-370f-460b-b7b5-7c57bd647b8b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=27074c74-d81e-4dc1-9e05-b59b6b9a0624) old=Port_Binding(mac=['fa:16:3e:e1:bd:9f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b412a37-c227-42ad-9fca-23287613486a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0111c22b4b954ea586ca20d91ed3970f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.102 104094 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 27074c74-d81e-4dc1-9e05-b59b6b9a0624 in datapath 7b412a37-c227-42ad-9fca-23287613486a updated#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.104 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b412a37-c227-42ad-9fca-23287613486a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.106 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[480d33e4-9599-4c77-9af1-0712586a5d52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.529 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.529 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:27.529 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:27 np0005539505 podman[249281]: 2025-11-29 07:45:27.745043974 +0000 UTC m=+0.066943020 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:45:27 np0005539505 nova_compute[186958]: 2025-11-29 07:45:27.935 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.795 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.795 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.815 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.951 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.951 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.959 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:45:28 np0005539505 nova_compute[186958]: 2025-11-29 07:45:28.959 186962 INFO nova.compute.claims [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.102 186962 DEBUG nova.compute.provider_tree [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.119 186962 DEBUG nova.scheduler.client.report [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.149 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.150 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.207 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.208 186962 DEBUG nova.network.neutron [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.245 186962 INFO nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.538 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:45:29 np0005539505 nova_compute[186958]: 2025-11-29 07:45:29.814 186962 DEBUG nova.policy [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:45:30 np0005539505 nova_compute[186958]: 2025-11-29 07:45:30.468 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402315.467193, 1186f0c5-bea2-440e-a805-9b10c97141d5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:30 np0005539505 nova_compute[186958]: 2025-11-29 07:45:30.469 186962 INFO nova.compute.manager [-] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:45:30 np0005539505 nova_compute[186958]: 2025-11-29 07:45:30.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:32 np0005539505 nova_compute[186958]: 2025-11-29 07:45:32.937 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.249 186962 DEBUG nova.compute.manager [None req-f033f535-b01d-43dc-b979-58b7851be99d - - - - - -] [instance: 1186f0c5-bea2-440e-a805-9b10c97141d5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.939 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.940 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.940 186962 INFO nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Creating image(s)#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.941 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.941 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.942 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:34 np0005539505 nova_compute[186958]: 2025-11-29 07:45:34.955 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.016 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.017 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.018 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.030 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.095 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.096 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.128 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.129 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.130 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.182 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.183 186962 DEBUG nova.virt.disk.api [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.184 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.239 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.241 186962 DEBUG nova.virt.disk.api [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.241 186962 DEBUG nova.objects.instance [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 7674bac7-cc80-42e0-8126-6e1f5699fa49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.603 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.603 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Ensure instance console log exists: /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.604 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.605 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.606 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.666 186962 DEBUG nova.network.neutron [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Successfully updated port: 2d4d5100-ecb5-4399-baa7-6c217e5618d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.697 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.698 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.698 186962 DEBUG nova.network.neutron [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:45:35 np0005539505 podman[249315]: 2025-11-29 07:45:35.709750703 +0000 UTC m=+0.043639783 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:45:35 np0005539505 podman[249316]: 2025-11-29 07:45:35.780930573 +0000 UTC m=+0.109995624 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.800 186962 DEBUG nova.compute.manager [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.800 186962 DEBUG nova.compute.manager [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Refreshing instance network info cache due to event network-changed-2d4d5100-ecb5-4399-baa7-6c217e5618d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.800 186962 DEBUG oslo_concurrency.lockutils [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:45:35 np0005539505 nova_compute[186958]: 2025-11-29 07:45:35.903 186962 DEBUG nova.network.neutron [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:45:37 np0005539505 nova_compute[186958]: 2025-11-29 07:45:37.106 186962 DEBUG nova.network.neutron [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Updating instance_info_cache with network_info: [{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:45:37 np0005539505 nova_compute[186958]: 2025-11-29 07:45:37.938 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.620 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.621 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Instance network_info: |[{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.622 186962 DEBUG oslo_concurrency.lockutils [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.623 186962 DEBUG nova.network.neutron [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Refreshing network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.629 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Start _get_guest_xml network_info=[{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.632 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.632 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.634 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.639 186962 WARNING nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.649 186962 DEBUG nova.virt.libvirt.host [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.650 186962 DEBUG nova.virt.libvirt.host [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.654 186962 DEBUG nova.virt.libvirt.host [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.655 186962 DEBUG nova.virt.libvirt.host [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.657 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.658 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.658 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.659 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.659 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.659 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.660 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.660 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.660 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.661 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.661 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.662 186962 DEBUG nova.virt.hardware [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.671 186962 DEBUG nova.virt.libvirt.vif [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:45:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-10480375',display_name='tempest-TestNetworkBasicOps-server-10480375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-10480375',id=165,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKiYmR9ib/8BbKKp+y9Gutb6cBupYZLPeT75FfEZuUGxl8Q/1ie7IhQo+tUOU/UwvFhvoU2xScXBm6SU8w4icpafQEOihCZe0MtU8t2NuhN8gR5XJrotfSiFRJFzrX93Ww==',key_name='tempest-TestNetworkBasicOps-1356352426',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0he4yl0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:45:31Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=7674bac7-cc80-42e0-8126-6e1f5699fa49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.671 186962 DEBUG nova.network.os_vif_util [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.673 186962 DEBUG nova.network.os_vif_util [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:45:39 np0005539505 nova_compute[186958]: 2025-11-29 07:45:39.675 186962 DEBUG nova.objects.instance [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7674bac7-cc80-42e0-8126-6e1f5699fa49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:45:39 np0005539505 podman[249363]: 2025-11-29 07:45:39.736714215 +0000 UTC m=+0.071265914 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:45:39 np0005539505 podman[249362]: 2025-11-29 07:45:39.746064153 +0000 UTC m=+0.082516497 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.034 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <uuid>7674bac7-cc80-42e0-8126-6e1f5699fa49</uuid>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <name>instance-000000a5</name>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkBasicOps-server-10480375</nova:name>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:45:39</nova:creationTime>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        <nova:port uuid="2d4d5100-ecb5-4399-baa7-6c217e5618d4">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="serial">7674bac7-cc80-42e0-8126-6e1f5699fa49</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="uuid">7674bac7-cc80-42e0-8126-6e1f5699fa49</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.config"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:5e:0e:8a"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <target dev="tap2d4d5100-ec"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/console.log" append="off"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:45:40 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:45:40 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:45:40 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:45:40 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.036 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Preparing to wait for external event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.037 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.037 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.037 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.039 186962 DEBUG nova.virt.libvirt.vif [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:45:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-10480375',display_name='tempest-TestNetworkBasicOps-server-10480375',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-10480375',id=165,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKiYmR9ib/8BbKKp+y9Gutb6cBupYZLPeT75FfEZuUGxl8Q/1ie7IhQo+tUOU/UwvFhvoU2xScXBm6SU8w4icpafQEOihCZe0MtU8t2NuhN8gR5XJrotfSiFRJFzrX93Ww==',key_name='tempest-TestNetworkBasicOps-1356352426',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0he4yl0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:45:31Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=7674bac7-cc80-42e0-8126-6e1f5699fa49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.039 186962 DEBUG nova.network.os_vif_util [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.040 186962 DEBUG nova.network.os_vif_util [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.041 186962 DEBUG os_vif [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.042 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.043 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.047 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.047 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d4d5100-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.048 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2d4d5100-ec, col_values=(('external_ids', {'iface-id': '2d4d5100-ecb5-4399-baa7-6c217e5618d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:0e:8a', 'vm-uuid': '7674bac7-cc80-42e0-8126-6e1f5699fa49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:40 np0005539505 NetworkManager[55134]: <info>  [1764402340.0519] manager: (tap2d4d5100-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.054 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.057 186962 INFO os_vif [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec')#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.170 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.170 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.170 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:5e:0e:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:45:40 np0005539505 nova_compute[186958]: 2025-11-29 07:45:40.171 186962 INFO nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Using config drive#033[00m
Nov 29 02:45:41 np0005539505 nova_compute[186958]: 2025-11-29 07:45:41.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:42 np0005539505 nova_compute[186958]: 2025-11-29 07:45:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:45:42 np0005539505 nova_compute[186958]: 2025-11-29 07:45:42.940 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:43 np0005539505 nova_compute[186958]: 2025-11-29 07:45:43.902 186962 INFO nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Creating config drive at /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.config#033[00m
Nov 29 02:45:43 np0005539505 nova_compute[186958]: 2025-11-29 07:45:43.907 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3xdsvp4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.048 186962 DEBUG oslo_concurrency.processutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf3xdsvp4" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:44 np0005539505 kernel: tap2d4d5100-ec: entered promiscuous mode
Nov 29 02:45:44 np0005539505 NetworkManager[55134]: <info>  [1764402344.1126] manager: (tap2d4d5100-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/381)
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.113 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:44Z|00776|binding|INFO|Claiming lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 for this chassis.
Nov 29 02:45:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:44Z|00777|binding|INFO|2d4d5100-ecb5-4399-baa7-6c217e5618d4: Claiming fa:16:3e:5e:0e:8a 10.100.0.9
Nov 29 02:45:44 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:44Z|00778|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 ovn-installed in OVS
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.127 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.129 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.132 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:44 np0005539505 systemd-udevd[249422]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:45:44 np0005539505 systemd-machined[153285]: New machine qemu-81-instance-000000a5.
Nov 29 02:45:44 np0005539505 NetworkManager[55134]: <info>  [1764402344.1513] device (tap2d4d5100-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:45:44 np0005539505 NetworkManager[55134]: <info>  [1764402344.1524] device (tap2d4d5100-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:45:44 np0005539505 systemd[1]: Started Virtual Machine qemu-81-instance-000000a5.
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.239 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.240 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.240 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.240 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.437 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402344.436321, 7674bac7-cc80-42e0-8126-6e1f5699fa49 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:44 np0005539505 nova_compute[186958]: 2025-11-29 07:45:44.438 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] VM Started (Lifecycle Event)#033[00m
Nov 29 02:45:45 np0005539505 nova_compute[186958]: 2025-11-29 07:45:45.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:45 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:45Z|00779|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 up in Southbound
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.072 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:0e:8a 10.100.0.9'], port_security=['fa:16:3e:5e:0e:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7674bac7-cc80-42e0-8126-6e1f5699fa49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45898937-6ca6-4da8-a84a-53586a0780e5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '7', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.186'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77dd2b4e-807c-4149-a535-57e12b7bc161, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2d4d5100-ecb5-4399-baa7-6c217e5618d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.073 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 in datapath 45898937-6ca6-4da8-a84a-53586a0780e5 bound to our chassis#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.075 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 45898937-6ca6-4da8-a84a-53586a0780e5#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.086 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4ebaa2-4d9f-44f1-a4ed-52b67d8c02e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.088 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap45898937-61 in ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.089 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap45898937-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.089 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b8659c76-310b-42ad-8c43-592a29dae797]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.090 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c316dba0-93fd-4e6d-9d6d-cb83fd181c2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.102 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[b477a49b-52d2-4e43-9958-54cd6c3bc4d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.117 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8201bec2-13f7-421f-8255-142eb6d5cc0c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.146 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[768a0ef4-ad9a-4e2f-b42d-0d0f088536c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.152 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[932d3ae3-c608-4839-8924-77a3b2d24770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 NetworkManager[55134]: <info>  [1764402345.1540] manager: (tap45898937-60): new Veth device (/org/freedesktop/NetworkManager/Devices/382)
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.190 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce213d3-4244-47ef-8dc5-9eedbeb9ffd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.193 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f1f6dd2b-52c2-4e80-b8d2-468767ab1603]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 NetworkManager[55134]: <info>  [1764402345.2103] device (tap45898937-60): carrier: link connected
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.214 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff4204f-1aaa-44c8-beae-7fd74a2e2091]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.229 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8b847f45-be1d-4cff-bb7d-137cc03f9ef9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45898937-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:26:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779279, 'reachable_time': 21371, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249466, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.242 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4920223a-6416-4762-9b2b-b6450ab1dd5b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:2607'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 779279, 'tstamp': 779279}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249467, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.261 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[adfb35dd-cc24-4155-8834-04ecc91e1599]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap45898937-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:26:07'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779279, 'reachable_time': 21371, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249468, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.294 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3f2fd587-5a30-4ada-b870-b5e65e0081b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.362 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[30ba95e7-6a75-4a6c-8700-a4d41bd9ffb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.365 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45898937-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.366 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.366 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45898937-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:45 np0005539505 nova_compute[186958]: 2025-11-29 07:45:45.370 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:45 np0005539505 kernel: tap45898937-60: entered promiscuous mode
Nov 29 02:45:45 np0005539505 NetworkManager[55134]: <info>  [1764402345.3714] manager: (tap45898937-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/383)
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.373 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap45898937-60, col_values=(('external_ids', {'iface-id': 'fca7ed1d-5327-4a82-9ef6-10233b98308b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:45:45 np0005539505 ovn_controller[95143]: 2025-11-29T07:45:45Z|00780|binding|INFO|Releasing lport fca7ed1d-5327-4a82-9ef6-10233b98308b from this chassis (sb_readonly=1)
Nov 29 02:45:45 np0005539505 nova_compute[186958]: 2025-11-29 07:45:45.374 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:45 np0005539505 nova_compute[186958]: 2025-11-29 07:45:45.385 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.386 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.387 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[122df115-2063-4f7e-92f6-b4fe8006f136]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.389 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-45898937-6ca6-4da8-a84a-53586a0780e5
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/45898937-6ca6-4da8-a84a-53586a0780e5.pid.haproxy
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 45898937-6ca6-4da8-a84a-53586a0780e5
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:45:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:45:45.389 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'env', 'PROCESS_TAG=haproxy-45898937-6ca6-4da8-a84a-53586a0780e5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/45898937-6ca6-4da8-a84a-53586a0780e5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:45:45 np0005539505 podman[249501]: 2025-11-29 07:45:45.788508541 +0000 UTC m=+0.054978767 container create a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 02:45:45 np0005539505 systemd[1]: Started libpod-conmon-a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc.scope.
Nov 29 02:45:45 np0005539505 podman[249501]: 2025-11-29 07:45:45.760593791 +0000 UTC m=+0.027064037 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:45:45 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:45:45 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21e95a02f788ecf1833272fed9d93e6abed717a8357c3b6b01c9758980a77a1c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:45:45 np0005539505 podman[249501]: 2025-11-29 07:45:45.876208035 +0000 UTC m=+0.142678291 container init a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 29 02:45:45 np0005539505 podman[249501]: 2025-11-29 07:45:45.882540107 +0000 UTC m=+0.149010333 container start a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:45:45 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [NOTICE]   (249520) : New worker (249522) forked
Nov 29 02:45:45 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [NOTICE]   (249520) : Loading success.
Nov 29 02:45:47 np0005539505 nova_compute[186958]: 2025-11-29 07:45:47.943 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:50 np0005539505 nova_compute[186958]: 2025-11-29 07:45:50.055 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.279 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.283 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402344.4365966, 7674bac7-cc80-42e0-8126-6e1f5699fa49 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.283 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.418 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.490 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.491 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.509 186962 DEBUG nova.network.neutron [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Updated VIF entry in instance network info cache for port 2d4d5100-ecb5-4399-baa7-6c217e5618d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.510 186962 DEBUG nova.network.neutron [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Updating instance_info_cache with network_info: [{"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.548 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.598 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.602 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.613 186962 DEBUG oslo_concurrency.lockutils [req-b40868ff-72f5-4a92-b850-d461aa7ba56a req-45f7ee09-3734-4174-80af-c3e1baff2336 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7674bac7-cc80-42e0-8126-6e1f5699fa49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.721 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.722 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5676MB free_disk=73.0722541809082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.722 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:51 np0005539505 nova_compute[186958]: 2025-11-29 07:45:51.722 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:52 np0005539505 nova_compute[186958]: 2025-11-29 07:45:52.946 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:54 np0005539505 podman[249540]: 2025-11-29 07:45:54.722779038 +0000 UTC m=+0.053607078 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:45:54 np0005539505 podman[249539]: 2025-11-29 07:45:54.733183336 +0000 UTC m=+0.065535330 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7)
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.905 186962 DEBUG nova.compute.manager [req-83f4a9ec-07f2-4a3d-933c-25d190ea875c req-9f019bca-56ed-4eb1-a53b-ab0e1f24aee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.906 186962 DEBUG oslo_concurrency.lockutils [req-83f4a9ec-07f2-4a3d-933c-25d190ea875c req-9f019bca-56ed-4eb1-a53b-ab0e1f24aee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.906 186962 DEBUG oslo_concurrency.lockutils [req-83f4a9ec-07f2-4a3d-933c-25d190ea875c req-9f019bca-56ed-4eb1-a53b-ab0e1f24aee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.906 186962 DEBUG oslo_concurrency.lockutils [req-83f4a9ec-07f2-4a3d-933c-25d190ea875c req-9f019bca-56ed-4eb1-a53b-ab0e1f24aee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.906 186962 DEBUG nova.compute.manager [req-83f4a9ec-07f2-4a3d-933c-25d190ea875c req-9f019bca-56ed-4eb1-a53b-ab0e1f24aee9 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Processing event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.907 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.911 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.914 186962 INFO nova.virt.libvirt.driver [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Instance spawned successfully.#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.914 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.917 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.917 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402354.9103825, 7674bac7-cc80-42e0-8126-6e1f5699fa49 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:45:54 np0005539505 nova_compute[186958]: 2025-11-29 07:45:54.918 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.003 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.004 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.004 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.005 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.005 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.005 186962 DEBUG nova.virt.libvirt.driver [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.059 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.072 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.077 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.113 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 7674bac7-cc80-42e0-8126-6e1f5699fa49 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.114 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.114 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.146 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.161 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.174 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.174 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.210 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.238 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:45:55 np0005539505 nova_compute[186958]: 2025-11-29 07:45:55.310 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:45:57 np0005539505 nova_compute[186958]: 2025-11-29 07:45:57.078 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:45:57 np0005539505 nova_compute[186958]: 2025-11-29 07:45:57.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:45:58 np0005539505 podman[249584]: 2025-11-29 07:45:58.733978389 +0000 UTC m=+0.052242528 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.063 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.602 186962 INFO nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Took 25.66 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.603 186962 DEBUG nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.618 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.619 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 8.897s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.768 186962 INFO nova.compute.manager [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Took 31.87 seconds to build instance.#033[00m
Nov 29 02:46:00 np0005539505 nova_compute[186958]: 2025-11-29 07:46:00.793 186962 DEBUG oslo_concurrency.lockutils [None req-4220d853-b5a8-4b2e-b846-4e00ce064a64 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 31.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:02 np0005539505 nova_compute[186958]: 2025-11-29 07:46:02.989 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:03.068 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.068 186962 DEBUG nova.compute.manager [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.068 186962 DEBUG oslo_concurrency.lockutils [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.069 186962 DEBUG oslo_concurrency.lockutils [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.069 186962 DEBUG oslo_concurrency.lockutils [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.069 186962 DEBUG nova.compute.manager [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] No waiting events found dispatching network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.069 186962 WARNING nova.compute.manager [req-7a0fa74b-7986-4f46-a396-a0e52e912539 req-e139cfa7-bc3c-4e5a-8177-8612fba4e2d5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received unexpected event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:46:03 np0005539505 nova_compute[186958]: 2025-11-29 07:46:03.069 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:03.070 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:46:05 np0005539505 nova_compute[186958]: 2025-11-29 07:46:05.067 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:06 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:06.074 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:06 np0005539505 nova_compute[186958]: 2025-11-29 07:46:06.615 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:06 np0005539505 nova_compute[186958]: 2025-11-29 07:46:06.615 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:06 np0005539505 podman[249618]: 2025-11-29 07:46:06.75390347 +0000 UTC m=+0.081866818 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:46:06 np0005539505 podman[249619]: 2025-11-29 07:46:06.770015372 +0000 UTC m=+0.086250244 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:46:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:46:07Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5e:0e:8a 10.100.0.9
Nov 29 02:46:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:46:07Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5e:0e:8a 10.100.0.9
Nov 29 02:46:07 np0005539505 nova_compute[186958]: 2025-11-29 07:46:07.995 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:10 np0005539505 nova_compute[186958]: 2025-11-29 07:46:10.071 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:10 np0005539505 podman[249668]: 2025-11-29 07:46:10.736792949 +0000 UTC m=+0.066328733 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 29 02:46:10 np0005539505 podman[249669]: 2025-11-29 07:46:10.738371014 +0000 UTC m=+0.063197863 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.490 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.490 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.491 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.491 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.491 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.509 186962 INFO nova.compute.manager [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Terminating instance#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.534 186962 DEBUG nova.compute.manager [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:46:11 np0005539505 kernel: tap2d4d5100-ec (unregistering): left promiscuous mode
Nov 29 02:46:11 np0005539505 NetworkManager[55134]: <info>  [1764402371.5562] device (tap2d4d5100-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.566 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:46:11Z|00781|binding|INFO|Releasing lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 from this chassis (sb_readonly=0)
Nov 29 02:46:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:46:11Z|00782|binding|INFO|Setting lport 2d4d5100-ecb5-4399-baa7-6c217e5618d4 down in Southbound
Nov 29 02:46:11 np0005539505 ovn_controller[95143]: 2025-11-29T07:46:11Z|00783|binding|INFO|Removing iface tap2d4d5100-ec ovn-installed in OVS
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.569 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.574 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:0e:8a 10.100.0.9'], port_security=['fa:16:3e:5e:0e:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7674bac7-cc80-42e0-8126-6e1f5699fa49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45898937-6ca6-4da8-a84a-53586a0780e5', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1665291496', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '9', 'neutron:security_group_ids': '046111b1-8479-4ebb-8db5-573e164c575e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.186', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77dd2b4e-807c-4149-a535-57e12b7bc161, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=2d4d5100-ecb5-4399-baa7-6c217e5618d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.575 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 2d4d5100-ecb5-4399-baa7-6c217e5618d4 in datapath 45898937-6ca6-4da8-a84a-53586a0780e5 unbound from our chassis#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.577 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45898937-6ca6-4da8-a84a-53586a0780e5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.579 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc8b127-3e08-4512-849e-02667bc54226]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.579 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 namespace which is not needed anymore#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.592 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Deactivated successfully.
Nov 29 02:46:11 np0005539505 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000a5.scope: Consumed 12.927s CPU time.
Nov 29 02:46:11 np0005539505 systemd-machined[153285]: Machine qemu-81-instance-000000a5 terminated.
Nov 29 02:46:11 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [NOTICE]   (249520) : haproxy version is 2.8.14-c23fe91
Nov 29 02:46:11 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [NOTICE]   (249520) : path to executable is /usr/sbin/haproxy
Nov 29 02:46:11 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [WARNING]  (249520) : Exiting Master process...
Nov 29 02:46:11 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [ALERT]    (249520) : Current worker (249522) exited with code 143 (Terminated)
Nov 29 02:46:11 np0005539505 neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5[249516]: [WARNING]  (249520) : All workers exited. Exiting... (0)
Nov 29 02:46:11 np0005539505 systemd[1]: libpod-a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc.scope: Deactivated successfully.
Nov 29 02:46:11 np0005539505 podman[249731]: 2025-11-29 07:46:11.714553152 +0000 UTC m=+0.048459011 container died a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:46:11 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc-userdata-shm.mount: Deactivated successfully.
Nov 29 02:46:11 np0005539505 systemd[1]: var-lib-containers-storage-overlay-21e95a02f788ecf1833272fed9d93e6abed717a8357c3b6b01c9758980a77a1c-merged.mount: Deactivated successfully.
Nov 29 02:46:11 np0005539505 podman[249731]: 2025-11-29 07:46:11.759750667 +0000 UTC m=+0.093656526 container cleanup a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.793 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 systemd[1]: libpod-conmon-a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc.scope: Deactivated successfully.
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.836 186962 INFO nova.virt.libvirt.driver [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Instance destroyed successfully.#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.836 186962 DEBUG nova.objects.instance [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 7674bac7-cc80-42e0-8126-6e1f5699fa49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:46:11 np0005539505 podman[249772]: 2025-11-29 07:46:11.8533254 +0000 UTC m=+0.042190490 container remove a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.858 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0a44695c-d140-4a2c-9d3a-e7070c128fbf]: (4, ('Sat Nov 29 07:46:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 (a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc)\na24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc\nSat Nov 29 07:46:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 (a24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc)\na24a1dd2fdc9f1fc39bff64c032e1b106111380fc85c6422907a75f9858fb1cc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.860 186962 DEBUG nova.virt.libvirt.vif [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:45:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-10480375',display_name='tempest-TestNetworkBasicOps-server-10480375',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-10480375',id=165,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKiYmR9ib/8BbKKp+y9Gutb6cBupYZLPeT75FfEZuUGxl8Q/1ie7IhQo+tUOU/UwvFhvoU2xScXBm6SU8w4icpafQEOihCZe0MtU8t2NuhN8gR5XJrotfSiFRJFzrX93Ww==',key_name='tempest-TestNetworkBasicOps-1356352426',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:46:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-0he4yl0e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:46:00Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=7674bac7-cc80-42e0-8126-6e1f5699fa49,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.861 186962 DEBUG nova.network.os_vif_util [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "address": "fa:16:3e:5e:0e:8a", "network": {"id": "45898937-6ca6-4da8-a84a-53586a0780e5", "bridge": "br-int", "label": "tempest-network-smoke--258511330", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2d4d5100-ec", "ovs_interfaceid": "2d4d5100-ecb5-4399-baa7-6c217e5618d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.861 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e4d912d7-1156-4cc8-af13-859a746ede88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.861 186962 DEBUG nova.network.os_vif_util [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.862 186962 DEBUG os_vif [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.862 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45898937-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.864 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.864 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d4d5100-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:46:11 np0005539505 kernel: tap45898937-60: left promiscuous mode
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.882 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4be44edc-73c2-4cb2-8048-0f9f4bdbd158]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.883 186962 INFO os_vif [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:0e:8a,bridge_name='br-int',has_traffic_filtering=True,id=2d4d5100-ecb5-4399-baa7-6c217e5618d4,network=Network(45898937-6ca6-4da8-a84a-53586a0780e5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap2d4d5100-ec')#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.883 186962 INFO nova.virt.libvirt.driver [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Deleting instance files /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49_del#033[00m
Nov 29 02:46:11 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.884 186962 INFO nova.virt.libvirt.driver [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Deletion of /var/lib/nova/instances/7674bac7-cc80-42e0-8126-6e1f5699fa49_del complete#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.895 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4a25c6-4420-45ce-b87e-aebfda692ad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.896 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[10ee9f54-c3de-45d0-8393-277b9b1b4af0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.913 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ca613d-c64c-44b5-92e0-56b2c554a2c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 779272, 'reachable_time': 16779, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249792, 'error': None, 'target': 'ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.916 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-45898937-6ca6-4da8-a84a-53586a0780e5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:46:11 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:11.917 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5d5d22-360f-4ad5-97ee-b74cdbcd642a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:46:11 np0005539505 systemd[1]: run-netns-ovnmeta\x2d45898937\x2d6ca6\x2d4da8\x2da84a\x2d53586a0780e5.mount: Deactivated successfully.
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:11.999 186962 DEBUG nova.compute.manager [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.000 186962 DEBUG oslo_concurrency.lockutils [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.000 186962 DEBUG oslo_concurrency.lockutils [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.001 186962 DEBUG oslo_concurrency.lockutils [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.001 186962 DEBUG nova.compute.manager [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] No waiting events found dispatching network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.001 186962 DEBUG nova.compute.manager [req-43eeffa5-2a88-46c1-96d6-7dea1b8512ce req-11c23c7d-165e-4a73-8163-f2dd0adab456 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-vif-unplugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.017 186962 INFO nova.compute.manager [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Took 0.48 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.018 186962 DEBUG oslo.service.loopingcall [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.019 186962 DEBUG nova.compute.manager [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.019 186962 DEBUG nova.network.neutron [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:46:12 np0005539505 nova_compute[186958]: 2025-11-29 07:46:12.998 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.396 186962 DEBUG nova.compute.manager [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.397 186962 DEBUG oslo_concurrency.lockutils [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.397 186962 DEBUG oslo_concurrency.lockutils [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.398 186962 DEBUG oslo_concurrency.lockutils [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.398 186962 DEBUG nova.compute.manager [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] No waiting events found dispatching network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:46:14 np0005539505 nova_compute[186958]: 2025-11-29 07:46:14.398 186962 WARNING nova.compute.manager [req-d854c6fd-2e91-4303-92c6-64b68320521c req-c76b6e6f-b999-4796-9365-f5913006c072 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Received unexpected event network-vif-plugged-2d4d5100-ecb5-4399-baa7-6c217e5618d4 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.233 186962 DEBUG nova.network.neutron [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.259 186962 INFO nova.compute.manager [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Took 3.24 seconds to deallocate network for instance.#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.365 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.365 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.445 186962 DEBUG nova.compute.provider_tree [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.488 186962 DEBUG nova.scheduler.client.report [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.524 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.567 186962 INFO nova.scheduler.client.report [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 7674bac7-cc80-42e0-8126-6e1f5699fa49#033[00m
Nov 29 02:46:15 np0005539505 nova_compute[186958]: 2025-11-29 07:46:15.652 186962 DEBUG oslo_concurrency.lockutils [None req-c8ee748d-4381-433a-8933-0c563ca0ae0a 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "7674bac7-cc80-42e0-8126-6e1f5699fa49" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:16 np0005539505 nova_compute[186958]: 2025-11-29 07:46:16.867 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:17 np0005539505 nova_compute[186958]: 2025-11-29 07:46:17.998 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:21 np0005539505 nova_compute[186958]: 2025-11-29 07:46:21.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:22 np0005539505 nova_compute[186958]: 2025-11-29 07:46:22.817 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:23 np0005539505 nova_compute[186958]: 2025-11-29 07:46:22.999 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:25 np0005539505 podman[249795]: 2025-11-29 07:46:25.731171749 +0000 UTC m=+0.055148393 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:46:25 np0005539505 podman[249794]: 2025-11-29 07:46:25.734660429 +0000 UTC m=+0.061972408 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:46:26 np0005539505 nova_compute[186958]: 2025-11-29 07:46:26.835 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402371.8330882, 7674bac7-cc80-42e0-8126-6e1f5699fa49 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:46:26 np0005539505 nova_compute[186958]: 2025-11-29 07:46:26.836 186962 INFO nova.compute.manager [-] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:46:26 np0005539505 nova_compute[186958]: 2025-11-29 07:46:26.856 186962 DEBUG nova.compute.manager [None req-a67e2c72-9fbb-4c32-abd6-80c308eff592 - - - - - -] [instance: 7674bac7-cc80-42e0-8126-6e1f5699fa49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:46:26 np0005539505 nova_compute[186958]: 2025-11-29 07:46:26.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:27 np0005539505 nova_compute[186958]: 2025-11-29 07:46:27.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:27.530 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:27.531 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:46:27.531 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:28 np0005539505 nova_compute[186958]: 2025-11-29 07:46:28.003 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:29 np0005539505 podman[249836]: 2025-11-29 07:46:29.726030912 +0000 UTC m=+0.054940767 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:46:31 np0005539505 nova_compute[186958]: 2025-11-29 07:46:31.874 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:33 np0005539505 nova_compute[186958]: 2025-11-29 07:46:33.005 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:34 np0005539505 nova_compute[186958]: 2025-11-29 07:46:34.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:35 np0005539505 nova_compute[186958]: 2025-11-29 07:46:35.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:35 np0005539505 nova_compute[186958]: 2025-11-29 07:46:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:35 np0005539505 nova_compute[186958]: 2025-11-29 07:46:35.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:46:36 np0005539505 nova_compute[186958]: 2025-11-29 07:46:36.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:37 np0005539505 podman[249855]: 2025-11-29 07:46:37.740431866 +0000 UTC m=+0.065839729 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:46:37 np0005539505 podman[249856]: 2025-11-29 07:46:37.760046348 +0000 UTC m=+0.090076664 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:46:38 np0005539505 nova_compute[186958]: 2025-11-29 07:46:38.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:40 np0005539505 nova_compute[186958]: 2025-11-29 07:46:40.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:40 np0005539505 nova_compute[186958]: 2025-11-29 07:46:40.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:46:40 np0005539505 nova_compute[186958]: 2025-11-29 07:46:40.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:46:40 np0005539505 nova_compute[186958]: 2025-11-29 07:46:40.711 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:46:41 np0005539505 nova_compute[186958]: 2025-11-29 07:46:41.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:41 np0005539505 podman[249903]: 2025-11-29 07:46:41.766198314 +0000 UTC m=+0.088725234 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:46:41 np0005539505 podman[249904]: 2025-11-29 07:46:41.784715815 +0000 UTC m=+0.100850491 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:46:41 np0005539505 nova_compute[186958]: 2025-11-29 07:46:41.879 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:42 np0005539505 nova_compute[186958]: 2025-11-29 07:46:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.582 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.583 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.583 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.583 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.762 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.763 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5692MB free_disk=73.073974609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.763 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:43 np0005539505 nova_compute[186958]: 2025-11-29 07:46:43.764 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.021 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.021 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.062 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.078 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.466 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:46:44 np0005539505 nova_compute[186958]: 2025-11-29 07:46:44.466 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:46 np0005539505 nova_compute[186958]: 2025-11-29 07:46:46.881 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:48 np0005539505 nova_compute[186958]: 2025-11-29 07:46:48.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:46:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:46:50 np0005539505 nova_compute[186958]: 2025-11-29 07:46:50.462 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:50 np0005539505 nova_compute[186958]: 2025-11-29 07:46:50.552 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:50 np0005539505 nova_compute[186958]: 2025-11-29 07:46:50.552 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:50 np0005539505 nova_compute[186958]: 2025-11-29 07:46:50.730 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:46:51 np0005539505 nova_compute[186958]: 2025-11-29 07:46:51.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:46:51 np0005539505 nova_compute[186958]: 2025-11-29 07:46:51.883 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:52 np0005539505 nova_compute[186958]: 2025-11-29 07:46:52.027 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:52 np0005539505 nova_compute[186958]: 2025-11-29 07:46:52.028 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:52 np0005539505 nova_compute[186958]: 2025-11-29 07:46:52.034 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:46:52 np0005539505 nova_compute[186958]: 2025-11-29 07:46:52.034 186962 INFO nova.compute.claims [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:46:53 np0005539505 nova_compute[186958]: 2025-11-29 07:46:53.037 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:53 np0005539505 nova_compute[186958]: 2025-11-29 07:46:53.387 186962 DEBUG nova.compute.provider_tree [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:46:53 np0005539505 nova_compute[186958]: 2025-11-29 07:46:53.451 186962 DEBUG nova.scheduler.client.report [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:46:53 np0005539505 nova_compute[186958]: 2025-11-29 07:46:53.613 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:53 np0005539505 nova_compute[186958]: 2025-11-29 07:46:53.614 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.095 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.096 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.186 186962 INFO nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.493 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.609 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.611 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.612 186962 INFO nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Creating image(s)#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.613 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.613 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.615 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.639 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.715 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.718 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.719 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.746 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.771 186962 DEBUG nova.policy [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1dd0a7f5aaff402eb032cd5e60540dcb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec8b80be17a14d1caf666636283749d0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.808 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:54 np0005539505 nova_compute[186958]: 2025-11-29 07:46:54.809 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.496 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk 1073741824" returned: 0 in 0.687s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.498 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.499 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.593 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.594 186962 DEBUG nova.virt.disk.api [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Checking if we can resize image /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.595 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.664 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.665 186962 DEBUG nova.virt.disk.api [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Cannot resize image /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.666 186962 DEBUG nova.objects.instance [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'migration_context' on Instance uuid 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.775 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.776 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Ensure instance console log exists: /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.776 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.777 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:46:55 np0005539505 nova_compute[186958]: 2025-11-29 07:46:55.777 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:46:56 np0005539505 podman[249959]: 2025-11-29 07:46:56.733508178 +0000 UTC m=+0.056566313 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, architecture=x86_64, vcs-type=git)
Nov 29 02:46:56 np0005539505 podman[249960]: 2025-11-29 07:46:56.753546982 +0000 UTC m=+0.062205694 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:46:56 np0005539505 nova_compute[186958]: 2025-11-29 07:46:56.886 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:46:57 np0005539505 nova_compute[186958]: 2025-11-29 07:46:57.974 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Successfully created port: b2eac748-00a1-49c4-847a-c446b4e6149b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:46:58 np0005539505 nova_compute[186958]: 2025-11-29 07:46:58.039 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:00 np0005539505 nova_compute[186958]: 2025-11-29 07:47:00.236 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Successfully updated port: b2eac748-00a1-49c4-847a-c446b4e6149b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:47:00 np0005539505 podman[250003]: 2025-11-29 07:47:00.747092828 +0000 UTC m=+0.074175658 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 02:47:01 np0005539505 nova_compute[186958]: 2025-11-29 07:47:01.238 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:01 np0005539505 nova_compute[186958]: 2025-11-29 07:47:01.238 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquired lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:01 np0005539505 nova_compute[186958]: 2025-11-29 07:47:01.238 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:47:01 np0005539505 nova_compute[186958]: 2025-11-29 07:47:01.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:01 np0005539505 nova_compute[186958]: 2025-11-29 07:47:01.970 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:47:03 np0005539505 nova_compute[186958]: 2025-11-29 07:47:03.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:03 np0005539505 nova_compute[186958]: 2025-11-29 07:47:03.659 186962 DEBUG nova.compute.manager [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:03 np0005539505 nova_compute[186958]: 2025-11-29 07:47:03.659 186962 DEBUG nova.compute.manager [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing instance network info cache due to event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:47:03 np0005539505 nova_compute[186958]: 2025-11-29 07:47:03.660 186962 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:06 np0005539505 nova_compute[186958]: 2025-11-29 07:47:06.753 186962 DEBUG nova.network.neutron [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:06 np0005539505 nova_compute[186958]: 2025-11-29 07:47:06.891 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.279 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Releasing lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.279 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance network_info: |[{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.280 186962 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.280 186962 DEBUG nova.network.neutron [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.283 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Start _get_guest_xml network_info=[{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.288 186962 WARNING nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.299 186962 DEBUG nova.virt.libvirt.host [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.300 186962 DEBUG nova.virt.libvirt.host [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.306 186962 DEBUG nova.virt.libvirt.host [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.306 186962 DEBUG nova.virt.libvirt.host [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.307 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.308 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.308 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.308 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.308 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.309 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.310 186962 DEBUG nova.virt.hardware [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
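The topology-selection lines above (preferences `0:0:0`, limits `65536:65536:65536`, one possible topology for 1 vCPU) follow from a simple enumeration: Nova considers every (sockets, cores, threads) triple whose product equals the flavor's vCPU count. This is a simplified sketch of that idea, not Nova's actual `_get_possible_cpu_topologies` code:

```python
# Simplified sketch (not Nova's code) of why a 1-vCPU flavor with no
# flavor/image topology preference ends up as sockets=1, cores=1, threads=1.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate every (sockets, cores, threads) whose product is vcpus,
    subject to the per-dimension limits logged above."""
    found = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    found.append(VirtCPUTopology(s, c, t))
    return found

print(possible_topologies(1))
# [VirtCPUTopology(sockets=1, cores=1, threads=1)]
```

With 1 vCPU only one factorization exists, which is why the log reports "Got 1 possible topologies"; a 4-vCPU flavor would yield six candidates for Nova to rank against the preferred topology.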
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.313 186962 DEBUG nova.virt.libvirt.vif [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:46:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1848280602',display_name='tempest-TestNetworkBasicOps-server-1848280602',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1848280602',id=168,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuAI6G4e10qXoY3lrFstC9/JrGmGx5HNROpZ+HVF30JLHpW0IRd8xFUUHjWa+4hkeT4LDlnugY6EC18c9DPpabAOdbctgDgdlq2B/acxzm/dLBKVujfLUNTmKNvN0S4kQ==',key_name='tempest-TestNetworkBasicOps-2091602780',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-fyqwdgad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:46:54Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.314 186962 DEBUG nova.network.os_vif_util [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.314 186962 DEBUG nova.network.os_vif_util [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.315 186962 DEBUG nova.objects.instance [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.762 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <uuid>53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd</uuid>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <name>instance-000000a8</name>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestNetworkBasicOps-server-1848280602</nova:name>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:47:07</nova:creationTime>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:user uuid="1dd0a7f5aaff402eb032cd5e60540dcb">tempest-TestNetworkBasicOps-1587012976-project-member</nova:user>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:project uuid="ec8b80be17a14d1caf666636283749d0">tempest-TestNetworkBasicOps-1587012976</nova:project>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        <nova:port uuid="b2eac748-00a1-49c4-847a-c446b4e6149b">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="serial">53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="uuid">53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.config"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:6f:8d:56"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <target dev="tapb2eac748-00"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/console.log" append="off"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:47:07 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:47:07 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:47:07 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:47:07 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
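The domain XML dumped above can be sanity-checked by parsing it back and comparing against the flavor: `<memory>131072</memory>` is in KiB (libvirt's default unit), i.e. the flavor's 128 MiB. A minimal sketch, using a trimmed stand-in for the logged document rather than the full XML:

```python
# Sketch: extract the key fields Nova put into the generated libvirt domain
# XML. The string below is a trimmed excerpt of the logged document.
import xml.etree.ElementTree as ET

domain_xml = """
<domain type="kvm">
  <uuid>53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd</uuid>
  <name>instance-000000a8</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <cpu mode="custom" match="exact">
    <model>Nehalem</model>
    <topology sockets="1" cores="1" threads="1"/>
  </cpu>
</domain>
"""

root = ET.fromstring(domain_xml)
summary = {
    "uuid": root.findtext("uuid"),
    "memory_kib": int(root.findtext("memory")),  # libvirt default unit: KiB
    "vcpus": int(root.findtext("vcpu")),
    "cpu_model": root.findtext("cpu/model"),
}
print(summary)
```

131072 KiB / 1024 = 128 MiB, matching `memory_mb=128` on the m1.nano flavor, and the `<cpu>` model matches the "CPU mode 'custom' models 'Nehalem'" line earlier in the log.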
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.763 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Preparing to wait for external event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.763 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.763 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.764 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.764 186962 DEBUG nova.virt.libvirt.vif [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:46:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1848280602',display_name='tempest-TestNetworkBasicOps-server-1848280602',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1848280602',id=168,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuAI6G4e10qXoY3lrFstC9/JrGmGx5HNROpZ+HVF30JLHpW0IRd8xFUUHjWa+4hkeT4LDlnugY6EC18c9DPpabAOdbctgDgdlq2B/acxzm/dLBKVujfLUNTmKNvN0S4kQ==',key_name='tempest-TestNetworkBasicOps-2091602780',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-fyqwdgad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:46:54Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.765 186962 DEBUG nova.network.os_vif_util [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.766 186962 DEBUG nova.network.os_vif_util [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.766 186962 DEBUG os_vif [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.767 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.767 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.768 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.770 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.770 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2eac748-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.771 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2eac748-00, col_values=(('external_ids', {'iface-id': 'b2eac748-00a1-49c4-847a-c446b4e6149b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:8d:56', 'vm-uuid': '53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.773 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539505 NetworkManager[55134]: <info>  [1764402427.7740] manager: (tapb2eac748-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.775 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.781 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:07 np0005539505 nova_compute[186958]: 2025-11-29 07:47:07.783 186962 INFO os_vif [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00')#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.004 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.004 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.005 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] No VIF found with MAC fa:16:3e:6f:8d:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.005 186962 INFO nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Using config drive#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:08 np0005539505 podman[250024]: 2025-11-29 07:47:08.722124383 +0000 UTC m=+0.054132213 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:47:08 np0005539505 podman[250025]: 2025-11-29 07:47:08.767287508 +0000 UTC m=+0.095281913 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.912 186962 INFO nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Creating config drive at /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.config#033[00m
Nov 29 02:47:08 np0005539505 nova_compute[186958]: 2025-11-29 07:47:08.919 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgcoguqmi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.050 186962 DEBUG oslo_concurrency.processutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgcoguqmi" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:47:09 np0005539505 kernel: tapb2eac748-00: entered promiscuous mode
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.1025] manager: (tapb2eac748-00): new Tun device (/org/freedesktop/NetworkManager/Devices/385)
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.104 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.108 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:09Z|00784|binding|INFO|Claiming lport b2eac748-00a1-49c4-847a-c446b4e6149b for this chassis.
Nov 29 02:47:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:09Z|00785|binding|INFO|b2eac748-00a1-49c4-847a-c446b4e6149b: Claiming fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:09 np0005539505 systemd-machined[153285]: New machine qemu-82-instance-000000a8.
Nov 29 02:47:09 np0005539505 systemd-udevd[250090]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.159 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.1636] device (tapb2eac748-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.1644] device (tapb2eac748-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:47:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:09Z|00786|binding|INFO|Setting lport b2eac748-00a1-49c4-847a-c446b4e6149b ovn-installed in OVS
Nov 29 02:47:09 np0005539505 systemd[1]: Started Virtual Machine qemu-82-instance-000000a8.
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.166 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:09Z|00787|binding|INFO|Setting lport b2eac748-00a1-49c4-847a-c446b4e6149b up in Southbound
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.239 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:8d:56 10.100.0.6'], port_security=['fa:16:3e:6f:8d:56 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd54ea4b2-8fd9-432a-a17e-06c22d030374', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ea582bf-1e13-4bde-a32f-d40e4d0b5b4f, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b2eac748-00a1-49c4-847a-c446b4e6149b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.241 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b2eac748-00a1-49c4-847a-c446b4e6149b in datapath 6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 bound to our chassis#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.243 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a4724ea-f380-4fe0-baf0-3c9adaf0ad69#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.254 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab68a66-a615-441b-ba2f-00f1f1d8529e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.255 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a4724ea-f1 in ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.257 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a4724ea-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.257 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7507da1a-b9d5-4783-8e21-eef6fe51df85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.258 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfbfd83-e5d3-4651-b7dd-15860711c35e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.269 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[06abac07-3d22-40f3-9c0e-90a028ed777c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.280 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c7d062b7-5605-4adf-8e41-bec282a8042a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.312 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[6d857232-b549-48a2-b168-5188773744da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.318 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf2a488-d3a1-4b40-a9f2-07597738ce99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.3199] manager: (tap6a4724ea-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/386)
Nov 29 02:47:09 np0005539505 systemd-udevd[250092]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.366 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f113b8-2090-4262-a51a-f0cf5ae95b50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.368 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6a79c7-7ec3-40a8-b817-33df9cbc0f44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.3928] device (tap6a4724ea-f0): carrier: link connected
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.400 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb3b17d-f9d9-4ae2-ae40-7a2deea5139d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.421 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5fba3fdc-a7b5-4759-ac06-f0c5770dc4c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4724ea-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:9a:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787697, 'reachable_time': 33378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250123, 'error': None, 'target': 'ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.444 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e2dfe5-3c50-495b-b413-7c14322cf066]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:9a0c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 787697, 'tstamp': 787697}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250124, 'error': None, 'target': 'ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.466 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6dea8bd9-ac54-49e7-9b78-f86626e02f51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a4724ea-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:9a:0c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 244], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787697, 'reachable_time': 33378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250125, 'error': None, 'target': 'ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.505 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7c21305a-581d-489e-9bc3-27cf734635d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.584 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9b6f0d-60c0-40ac-afe8-09da485e9078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.586 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4724ea-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.586 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.587 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a4724ea-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:09 np0005539505 NetworkManager[55134]: <info>  [1764402429.5897] manager: (tap6a4724ea-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/387)
Nov 29 02:47:09 np0005539505 kernel: tap6a4724ea-f0: entered promiscuous mode
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.590 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.592 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a4724ea-f0, col_values=(('external_ids', {'iface-id': 'a0159633-cdfd-455e-8859-4bf02f43afa1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:09Z|00788|binding|INFO|Releasing lport a0159633-cdfd-455e-8859-4bf02f43afa1 from this chassis (sb_readonly=0)
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.595 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a4724ea-f380-4fe0-baf0-3c9adaf0ad69.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a4724ea-f380-4fe0-baf0-3c9adaf0ad69.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.596 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[df4a99ef-a9f4-4571-9a2a-b260b37989dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.597 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/6a4724ea-f380-4fe0-baf0-3c9adaf0ad69.pid.haproxy
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 6a4724ea-f380-4fe0-baf0-3c9adaf0ad69
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:47:09 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:09.598 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'env', 'PROCESS_TAG=haproxy-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a4724ea-f380-4fe0-baf0-3c9adaf0ad69.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.622 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402429.6219728, 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.623 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] VM Started (Lifecycle Event)#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.689 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.695 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402429.6235497, 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:47:09 np0005539505 nova_compute[186958]: 2025-11-29 07:47:09.695 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:47:10 np0005539505 podman[250163]: 2025-11-29 07:47:09.941088741 +0000 UTC m=+0.020368355 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.042 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.046 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:47:10 np0005539505 podman[250163]: 2025-11-29 07:47:10.183493161 +0000 UTC m=+0.262772765 container create c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:47:10 np0005539505 systemd[1]: Started libpod-conmon-c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958.scope.
Nov 29 02:47:10 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:47:10 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc38caba359d0a622161f7646b5bbceec42191db82cac50a057b53e5d2528918/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:47:10 np0005539505 podman[250163]: 2025-11-29 07:47:10.373417195 +0000 UTC m=+0.452696809 container init c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:10 np0005539505 podman[250163]: 2025-11-29 07:47:10.3791649 +0000 UTC m=+0.458444504 container start c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 29 02:47:10 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [NOTICE]   (250182) : New worker (250184) forked
Nov 29 02:47:10 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [NOTICE]   (250182) : Loading success.
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.400 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.577 186962 DEBUG nova.network.neutron [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updated VIF entry in instance network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.577 186962 DEBUG nova.network.neutron [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.640 186962 DEBUG oslo_concurrency.lockutils [req-4411a639-bcbd-42af-81ea-2beb39f48257 req-88b9eddf-a133-4190-b0e5-02f374849171 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.993 186962 DEBUG nova.compute.manager [req-26c29aba-afdf-4cfe-86fb-18f5361fd6e0 req-a211fe4e-8e2b-4206-9ce9-836de5378b62 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.994 186962 DEBUG oslo_concurrency.lockutils [req-26c29aba-afdf-4cfe-86fb-18f5361fd6e0 req-a211fe4e-8e2b-4206-9ce9-836de5378b62 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.995 186962 DEBUG oslo_concurrency.lockutils [req-26c29aba-afdf-4cfe-86fb-18f5361fd6e0 req-a211fe4e-8e2b-4206-9ce9-836de5378b62 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.996 186962 DEBUG oslo_concurrency.lockutils [req-26c29aba-afdf-4cfe-86fb-18f5361fd6e0 req-a211fe4e-8e2b-4206-9ce9-836de5378b62 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.996 186962 DEBUG nova.compute.manager [req-26c29aba-afdf-4cfe-86fb-18f5361fd6e0 req-a211fe4e-8e2b-4206-9ce9-836de5378b62 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Processing event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:47:10 np0005539505 nova_compute[186958]: 2025-11-29 07:47:10.997 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.002 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402431.0017893, 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.002 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.005 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.009 186962 INFO nova.virt.libvirt.driver [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance spawned successfully.#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.010 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.061 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.067 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.070 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.071 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.071 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.071 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.072 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.072 186962 DEBUG nova.virt.libvirt.driver [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.159 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.201 186962 INFO nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Took 16.59 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.202 186962 DEBUG nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:11 np0005539505 nova_compute[186958]: 2025-11-29 07:47:11.872 186962 INFO nova.compute.manager [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Took 20.00 seconds to build instance.#033[00m
Nov 29 02:47:12 np0005539505 nova_compute[186958]: 2025-11-29 07:47:12.275 186962 DEBUG oslo_concurrency.lockutils [None req-89bffb13-a704-4217-ba35-d03d45856c6f 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:12 np0005539505 podman[250194]: 2025-11-29 07:47:12.741663453 +0000 UTC m=+0.065324164 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 29 02:47:12 np0005539505 podman[250193]: 2025-11-29 07:47:12.750873437 +0000 UTC m=+0.078081349 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:12 np0005539505 nova_compute[186958]: 2025-11-29 07:47:12.775 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.243 186962 DEBUG nova.compute.manager [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.244 186962 DEBUG oslo_concurrency.lockutils [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.244 186962 DEBUG oslo_concurrency.lockutils [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.245 186962 DEBUG oslo_concurrency.lockutils [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.245 186962 DEBUG nova.compute.manager [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] No waiting events found dispatching network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:47:13 np0005539505 nova_compute[186958]: 2025-11-29 07:47:13.246 186962 WARNING nova.compute.manager [req-a1ad5f10-8b8f-407f-9720-9d137c04d6aa req-a4eb57d0-20f7-4633-92c0-3abd9bb48f9c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received unexpected event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b for instance with vm_state active and task_state None.#033[00m
Nov 29 02:47:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:14.004 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:47:14 np0005539505 nova_compute[186958]: 2025-11-29 07:47:14.005 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:14 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:14.006 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:47:14 np0005539505 nova_compute[186958]: 2025-11-29 07:47:14.767 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:14 np0005539505 NetworkManager[55134]: <info>  [1764402434.7684] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/388)
Nov 29 02:47:14 np0005539505 NetworkManager[55134]: <info>  [1764402434.7690] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Nov 29 02:47:14 np0005539505 nova_compute[186958]: 2025-11-29 07:47:14.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:14Z|00789|binding|INFO|Releasing lport a0159633-cdfd-455e-8859-4bf02f43afa1 from this chassis (sb_readonly=0)
Nov 29 02:47:14 np0005539505 nova_compute[186958]: 2025-11-29 07:47:14.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:15 np0005539505 nova_compute[186958]: 2025-11-29 07:47:15.500 186962 DEBUG nova.compute.manager [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:15 np0005539505 nova_compute[186958]: 2025-11-29 07:47:15.501 186962 DEBUG nova.compute.manager [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing instance network info cache due to event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:47:15 np0005539505 nova_compute[186958]: 2025-11-29 07:47:15.501 186962 DEBUG oslo_concurrency.lockutils [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:15 np0005539505 nova_compute[186958]: 2025-11-29 07:47:15.501 186962 DEBUG oslo_concurrency.lockutils [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:15 np0005539505 nova_compute[186958]: 2025-11-29 07:47:15.502 186962 DEBUG nova.network.neutron [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:47:17 np0005539505 nova_compute[186958]: 2025-11-29 07:47:17.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:18 np0005539505 nova_compute[186958]: 2025-11-29 07:47:18.048 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:19 np0005539505 nova_compute[186958]: 2025-11-29 07:47:19.783 186962 DEBUG nova.network.neutron [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updated VIF entry in instance network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:47:19 np0005539505 nova_compute[186958]: 2025-11-29 07:47:19.784 186962 DEBUG nova.network.neutron [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:20.008 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:20 np0005539505 nova_compute[186958]: 2025-11-29 07:47:20.175 186962 DEBUG oslo_concurrency.lockutils [req-0cb3f2d1-b4ee-4cc2-849d-3ffeb9d90883 req-224d3ebb-7e19-456b-b165-a4884498229d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:22 np0005539505 nova_compute[186958]: 2025-11-29 07:47:22.782 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:23 np0005539505 nova_compute[186958]: 2025-11-29 07:47:23.050 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:25Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:25 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:25Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:27.532 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:27.533 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:27.534 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:27 np0005539505 podman[250252]: 2025-11-29 07:47:27.720232821 +0000 UTC m=+0.048538763 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:47:27 np0005539505 podman[250251]: 2025-11-29 07:47:27.725127341 +0000 UTC m=+0.056314086 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 29 02:47:27 np0005539505 nova_compute[186958]: 2025-11-29 07:47:27.787 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:28 np0005539505 nova_compute[186958]: 2025-11-29 07:47:28.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:31 np0005539505 podman[250295]: 2025-11-29 07:47:31.721022704 +0000 UTC m=+0.053684311 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:31 np0005539505 nova_compute[186958]: 2025-11-29 07:47:31.944 186962 INFO nova.compute.manager [None req-35459942-4cb7-4987-a4f4-aa56adfe45cd 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Get console output#033[00m
Nov 29 02:47:31 np0005539505 nova_compute[186958]: 2025-11-29 07:47:31.949 213540 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 29 02:47:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:32Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:32 np0005539505 nova_compute[186958]: 2025-11-29 07:47:32.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:33 np0005539505 nova_compute[186958]: 2025-11-29 07:47:33.053 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:34Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:35 np0005539505 nova_compute[186958]: 2025-11-29 07:47:35.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:35 np0005539505 nova_compute[186958]: 2025-11-29 07:47:35.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:35 np0005539505 nova_compute[186958]: 2025-11-29 07:47:35.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:47:35 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:35Z|00790|binding|INFO|Releasing lport a0159633-cdfd-455e-8859-4bf02f43afa1 from this chassis (sb_readonly=0)
Nov 29 02:47:35 np0005539505 nova_compute[186958]: 2025-11-29 07:47:35.933 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:36 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:36Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:8d:56 10.100.0.6
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.294 186962 DEBUG nova.compute.manager [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.294 186962 DEBUG nova.compute.manager [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing instance network info cache due to event network-changed-b2eac748-00a1-49c4-847a-c446b4e6149b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.295 186962 DEBUG oslo_concurrency.lockutils [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.295 186962 DEBUG oslo_concurrency.lockutils [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.295 186962 DEBUG nova.network.neutron [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Refreshing network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.391 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.392 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.392 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.392 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.392 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.408 186962 INFO nova.compute.manager [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Terminating instance#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.422 186962 DEBUG nova.compute.manager [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:47:37 np0005539505 kernel: tapb2eac748-00 (unregistering): left promiscuous mode
Nov 29 02:47:37 np0005539505 NetworkManager[55134]: <info>  [1764402457.5333] device (tapb2eac748-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:37Z|00791|binding|INFO|Releasing lport b2eac748-00a1-49c4-847a-c446b4e6149b from this chassis (sb_readonly=0)
Nov 29 02:47:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:37Z|00792|binding|INFO|Setting lport b2eac748-00a1-49c4-847a-c446b4e6149b down in Southbound
Nov 29 02:47:37 np0005539505 ovn_controller[95143]: 2025-11-29T07:47:37Z|00793|binding|INFO|Removing iface tapb2eac748-00 ovn-installed in OVS
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.549 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.564 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a8.scope: Deactivated successfully.
Nov 29 02:47:37 np0005539505 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000a8.scope: Consumed 13.405s CPU time.
Nov 29 02:47:37 np0005539505 systemd-machined[153285]: Machine qemu-82-instance-000000a8 terminated.
Nov 29 02:47:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:37.634 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:8d:56 10.100.0.6'], port_security=['fa:16:3e:6f:8d:56 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ec8b80be17a14d1caf666636283749d0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd54ea4b2-8fd9-432a-a17e-06c22d030374', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4ea582bf-1e13-4bde-a32f-d40e4d0b5b4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=b2eac748-00a1-49c4-847a-c446b4e6149b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:47:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:37.635 104094 INFO neutron.agent.ovn.metadata.agent [-] Port b2eac748-00a1-49c4-847a-c446b4e6149b in datapath 6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 unbound from our chassis#033[00m
Nov 29 02:47:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:37.637 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:47:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:37.638 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82fc17b7-e4cc-4737-83e9-1fc7d6b116dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:37.639 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 namespace which is not needed anymore#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.652 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.657 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.697 186962 INFO nova.virt.libvirt.driver [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance destroyed successfully.#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.698 186962 DEBUG nova.objects.instance [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lazy-loading 'resources' on Instance uuid 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.793 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [NOTICE]   (250182) : haproxy version is 2.8.14-c23fe91
Nov 29 02:47:37 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [NOTICE]   (250182) : path to executable is /usr/sbin/haproxy
Nov 29 02:47:37 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [WARNING]  (250182) : Exiting Master process...
Nov 29 02:47:37 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [ALERT]    (250182) : Current worker (250184) exited with code 143 (Terminated)
Nov 29 02:47:37 np0005539505 neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69[250178]: [WARNING]  (250182) : All workers exited. Exiting... (0)
Nov 29 02:47:37 np0005539505 systemd[1]: libpod-c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958.scope: Deactivated successfully.
Nov 29 02:47:37 np0005539505 podman[250357]: 2025-11-29 07:47:37.810168681 +0000 UTC m=+0.081826587 container died c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.836 186962 DEBUG nova.virt.libvirt.vif [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:46:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1848280602',display_name='tempest-TestNetworkBasicOps-server-1848280602',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1848280602',id=168,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKuAI6G4e10qXoY3lrFstC9/JrGmGx5HNROpZ+HVF30JLHpW0IRd8xFUUHjWa+4hkeT4LDlnugY6EC18c9DPpabAOdbctgDgdlq2B/acxzm/dLBKVujfLUNTmKNvN0S4kQ==',key_name='tempest-TestNetworkBasicOps-2091602780',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:47:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ec8b80be17a14d1caf666636283749d0',ramdisk_id='',reservation_id='r-fyqwdgad',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1587012976',owner_user_name='tempest-TestNetworkBasicOps-1587012976-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:47:11Z,user_data=None,user_id='1dd0a7f5aaff402eb032cd5e60540dcb',uuid=53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.837 186962 DEBUG nova.network.os_vif_util [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converting VIF {"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.838 186962 DEBUG nova.network.os_vif_util [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.839 186962 DEBUG os_vif [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.842 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.843 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2eac748-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.847 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.850 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.855 186962 INFO os_vif [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:8d:56,bridge_name='br-int',has_traffic_filtering=True,id=b2eac748-00a1-49c4-847a-c446b4e6149b,network=Network(6a4724ea-f380-4fe0-baf0-3c9adaf0ad69),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2eac748-00')#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.855 186962 INFO nova.virt.libvirt.driver [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Deleting instance files /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd_del#033[00m
Nov 29 02:47:37 np0005539505 nova_compute[186958]: 2025-11-29 07:47:37.856 186962 INFO nova.virt.libvirt.driver [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Deletion of /var/lib/nova/instances/53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd_del complete#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.056 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.087 186962 INFO nova.compute.manager [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Took 0.66 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.088 186962 DEBUG oslo.service.loopingcall [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.088 186962 DEBUG nova.compute.manager [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.089 186962 DEBUG nova.network.neutron [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:47:38 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958-userdata-shm.mount: Deactivated successfully.
Nov 29 02:47:38 np0005539505 systemd[1]: var-lib-containers-storage-overlay-dc38caba359d0a622161f7646b5bbceec42191db82cac50a057b53e5d2528918-merged.mount: Deactivated successfully.
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.555 186962 DEBUG nova.compute.manager [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-unplugged-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.555 186962 DEBUG oslo_concurrency.lockutils [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.555 186962 DEBUG oslo_concurrency.lockutils [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.555 186962 DEBUG oslo_concurrency.lockutils [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.556 186962 DEBUG nova.compute.manager [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] No waiting events found dispatching network-vif-unplugged-b2eac748-00a1-49c4-847a-c446b4e6149b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:47:38 np0005539505 nova_compute[186958]: 2025-11-29 07:47:38.556 186962 DEBUG nova.compute.manager [req-80488c21-d690-479a-972d-6f6100c7aab9 req-6ad2e833-421c-4d41-85c6-f53076377a66 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-unplugged-b2eac748-00a1-49c4-847a-c446b4e6149b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:47:38 np0005539505 podman[250357]: 2025-11-29 07:47:38.834854569 +0000 UTC m=+1.106512445 container cleanup c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:47:38 np0005539505 systemd[1]: libpod-conmon-c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958.scope: Deactivated successfully.
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.360 186962 DEBUG nova.network.neutron [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updated VIF entry in instance network info cache for port b2eac748-00a1-49c4-847a-c446b4e6149b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.361 186962 DEBUG nova.network.neutron [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [{"id": "b2eac748-00a1-49c4-847a-c446b4e6149b", "address": "fa:16:3e:6f:8d:56", "network": {"id": "6a4724ea-f380-4fe0-baf0-3c9adaf0ad69", "bridge": "br-int", "label": "tempest-network-smoke--1656840589", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ec8b80be17a14d1caf666636283749d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2eac748-00", "ovs_interfaceid": "b2eac748-00a1-49c4-847a-c446b4e6149b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.405 186962 DEBUG oslo_concurrency.lockutils [req-c79e5085-6525-4dc1-afd7-0aa0d2c9656a req-c4013300-3b5f-4ae2-83de-e6469b48d71a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.407 186962 DEBUG nova.network.neutron [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.429 186962 INFO nova.compute.manager [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Took 1.34 seconds to deallocate network for instance.#033[00m
Nov 29 02:47:39 np0005539505 podman[250389]: 2025-11-29 07:47:39.429949959 +0000 UTC m=+0.567853950 container remove c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.435 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a5a2dc-a44b-4067-b7dc-b4a8cdb99dc4]: (4, ('Sat Nov 29 07:47:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 (c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958)\nc125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958\nSat Nov 29 07:47:38 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 (c125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958)\nc125d529f337a4d11fae79e85c739438fc769fb5747ed9a419c3a9de15b00958\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.439 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[19450b32-b7b8-4795-8a27-14123fb71b0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.441 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a4724ea-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.443 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:39 np0005539505 kernel: tap6a4724ea-f0: left promiscuous mode
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.457 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.462 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d19f66cd-7f13-4c3a-8b17-bffbf4dd0ede]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.473 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dde3a74d-c9a3-4f37-a1be-d8c767d5d80f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.475 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[82f47716-f1c0-4f4e-9e2f-bbee5a362295]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.496 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ea01599c-2658-4a7b-b386-3e85f7a9eb49]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 787689, 'reachable_time': 32645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250448, 'error': None, 'target': 'ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.499 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a4724ea-f380-4fe0-baf0-3c9adaf0ad69 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:47:39 np0005539505 systemd[1]: run-netns-ovnmeta\x2d6a4724ea\x2df380\x2d4fe0\x2dbaf0\x2d3c9adaf0ad69.mount: Deactivated successfully.
Nov 29 02:47:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:47:39.500 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0aa1b3-03ff-4063-ba1d-d9e3fba957b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:47:39 np0005539505 podman[250390]: 2025-11-29 07:47:39.503385165 +0000 UTC m=+0.630068985 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:47:39 np0005539505 podman[250396]: 2025-11-29 07:47:39.504910718 +0000 UTC m=+0.624727421 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Nov 29 02:47:39 np0005539505 nova_compute[186958]: 2025-11-29 07:47:39.664 186962 DEBUG nova.compute.manager [req-e7a9b67d-3778-46fd-8fb7-134a1f97635c req-fb051169-e6b0-4d8d-9878-b889db323d90 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-deleted-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.197 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.198 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.266 186962 DEBUG nova.compute.provider_tree [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:47:40 np0005539505 nova_compute[186958]: 2025-11-29 07:47:40.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.050 186962 DEBUG nova.compute.manager [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.052 186962 DEBUG oslo_concurrency.lockutils [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.052 186962 DEBUG oslo_concurrency.lockutils [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.053 186962 DEBUG oslo_concurrency.lockutils [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.053 186962 DEBUG nova.compute.manager [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] No waiting events found dispatching network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.054 186962 WARNING nova.compute.manager [req-079b8189-10a8-4bc2-a02f-5c8467483489 req-c1c8440e-31f4-4fde-a391-16a3addeda02 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Received unexpected event network-vif-plugged-b2eac748-00a1-49c4-847a-c446b4e6149b for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.058 186962 DEBUG nova.scheduler.client.report [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.083 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.119 186962 INFO nova.scheduler.client.report [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Deleted allocations for instance 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.223 186962 DEBUG oslo_concurrency.lockutils [None req-53cf1a35-44a5-4d01-89c9-dd916e77fb58 1dd0a7f5aaff402eb032cd5e60540dcb ec8b80be17a14d1caf666636283749d0 - - default default] Lock "53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.229 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.229 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.229 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.229 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.280 186962 DEBUG nova.compute.utils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.465 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.838 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.881 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.882 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:47:41 np0005539505 nova_compute[186958]: 2025-11-29 07:47:41.882 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:42 np0005539505 nova_compute[186958]: 2025-11-29 07:47:42.849 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:43 np0005539505 nova_compute[186958]: 2025-11-29 07:47:43.058 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:43 np0005539505 podman[250453]: 2025-11-29 07:47:43.721360095 +0000 UTC m=+0.055556174 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:47:43 np0005539505 podman[250452]: 2025-11-29 07:47:43.746091214 +0000 UTC m=+0.080227631 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.402 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.403 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.573 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.575 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5665MB free_disk=73.07304000854492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.575 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.575 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.640 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.641 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.668 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.682 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.701 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:47:44 np0005539505 nova_compute[186958]: 2025-11-29 07:47:44.701 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:47:46 np0005539505 nova_compute[186958]: 2025-11-29 07:47:46.938 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:47 np0005539505 nova_compute[186958]: 2025-11-29 07:47:47.167 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:47 np0005539505 nova_compute[186958]: 2025-11-29 07:47:47.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:48 np0005539505 nova_compute[186958]: 2025-11-29 07:47:48.059 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:50 np0005539505 nova_compute[186958]: 2025-11-29 07:47:50.696 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:51 np0005539505 nova_compute[186958]: 2025-11-29 07:47:51.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:52 np0005539505 nova_compute[186958]: 2025-11-29 07:47:52.696 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402457.6947854, 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:47:52 np0005539505 nova_compute[186958]: 2025-11-29 07:47:52.696 186962 INFO nova.compute.manager [-] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:47:52 np0005539505 nova_compute[186958]: 2025-11-29 07:47:52.720 186962 DEBUG nova.compute.manager [None req-49bb84ba-751e-406b-b46c-2b35882c6ebe - - - - - -] [instance: 53ea27aa-26a1-4d2d-bb59-2b2c650fd2bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:47:52 np0005539505 nova_compute[186958]: 2025-11-29 07:47:52.854 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:53 np0005539505 nova_compute[186958]: 2025-11-29 07:47:53.061 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:56 np0005539505 nova_compute[186958]: 2025-11-29 07:47:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:47:57 np0005539505 nova_compute[186958]: 2025-11-29 07:47:57.858 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:47:57 np0005539505 podman[250490]: 2025-11-29 07:47:57.943121673 +0000 UTC m=+0.059604760 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Nov 29 02:47:57 np0005539505 podman[250491]: 2025-11-29 07:47:57.953008067 +0000 UTC m=+0.065104228 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:47:58 np0005539505 nova_compute[186958]: 2025-11-29 07:47:58.063 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:02 np0005539505 podman[250536]: 2025-11-29 07:48:02.71870467 +0000 UTC m=+0.055455821 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:48:02 np0005539505 nova_compute[186958]: 2025-11-29 07:48:02.861 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:03 np0005539505 nova_compute[186958]: 2025-11-29 07:48:03.064 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:07 np0005539505 nova_compute[186958]: 2025-11-29 07:48:07.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:08 np0005539505 nova_compute[186958]: 2025-11-29 07:48:08.068 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:09 np0005539505 podman[250555]: 2025-11-29 07:48:09.763279706 +0000 UTC m=+0.089343342 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:48:09 np0005539505 podman[250556]: 2025-11-29 07:48:09.773699995 +0000 UTC m=+0.088637382 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 29 02:48:10 np0005539505 nova_compute[186958]: 2025-11-29 07:48:10.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:10 np0005539505 nova_compute[186958]: 2025-11-29 07:48:10.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:48:12 np0005539505 nova_compute[186958]: 2025-11-29 07:48:12.870 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:13 np0005539505 nova_compute[186958]: 2025-11-29 07:48:13.112 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:14 np0005539505 nova_compute[186958]: 2025-11-29 07:48:14.397 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:14 np0005539505 nova_compute[186958]: 2025-11-29 07:48:14.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:48:14 np0005539505 nova_compute[186958]: 2025-11-29 07:48:14.703 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:48:14 np0005539505 podman[250599]: 2025-11-29 07:48:14.757008207 +0000 UTC m=+0.082544308 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:48:14 np0005539505 podman[250600]: 2025-11-29 07:48:14.766144318 +0000 UTC m=+0.095483928 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 29 02:48:17 np0005539505 nova_compute[186958]: 2025-11-29 07:48:17.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:18 np0005539505 nova_compute[186958]: 2025-11-29 07:48:18.113 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:22 np0005539505 nova_compute[186958]: 2025-11-29 07:48:22.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:23 np0005539505 nova_compute[186958]: 2025-11-29 07:48:23.171 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:24 np0005539505 nova_compute[186958]: 2025-11-29 07:48:24.511 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:24.512 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:24.513 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:48:24 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:24Z|00794|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 02:48:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:27.533 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:27.534 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:27 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:48:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:27.534 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:27 np0005539505 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 29 02:48:27 np0005539505 nova_compute[186958]: 2025-11-29 07:48:27.680 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:27 np0005539505 nova_compute[186958]: 2025-11-29 07:48:27.881 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:28 np0005539505 nova_compute[186958]: 2025-11-29 07:48:28.173 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:28.515 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:28 np0005539505 podman[250640]: 2025-11-29 07:48:28.726528323 +0000 UTC m=+0.054468593 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:48:28 np0005539505 podman[250641]: 2025-11-29 07:48:28.748337168 +0000 UTC m=+0.066810416 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:48:30 np0005539505 nova_compute[186958]: 2025-11-29 07:48:30.457 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:30 np0005539505 nova_compute[186958]: 2025-11-29 07:48:30.458 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:30 np0005539505 nova_compute[186958]: 2025-11-29 07:48:30.623 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.171 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.172 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.181 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.182 186962 INFO nova.compute.claims [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.592 186962 DEBUG nova.compute.provider_tree [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:31 np0005539505 nova_compute[186958]: 2025-11-29 07:48:31.950 186962 DEBUG nova.scheduler.client.report [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.046 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.046 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.127 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.128 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.161 186962 INFO nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.185 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.384 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.386 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.386 186962 INFO nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Creating image(s)#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.387 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.387 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.388 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.401 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.465 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.466 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.466 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.478 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.546 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.547 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.587 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.588 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.589 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.646 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.647 186962 DEBUG nova.virt.disk.api [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Checking if we can resize image /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.647 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.706 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.708 186962 DEBUG nova.virt.disk.api [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Cannot resize image /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.708 186962 DEBUG nova.objects.instance [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'migration_context' on Instance uuid 55f06dbd-4385-42ee-b258-fb14baca55e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.726 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.727 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Ensure instance console log exists: /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.727 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.727 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.728 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:32 np0005539505 nova_compute[186958]: 2025-11-29 07:48:32.884 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:33 np0005539505 nova_compute[186958]: 2025-11-29 07:48:33.230 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:33 np0005539505 podman[250701]: 2025-11-29 07:48:33.711412719 +0000 UTC m=+0.046140074 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:48:34 np0005539505 nova_compute[186958]: 2025-11-29 07:48:34.223 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Successfully created port: 5461b2f6-84ea-439e-a902-72bbf9a5aa30 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:48:36 np0005539505 nova_compute[186958]: 2025-11-29 07:48:36.025 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Successfully updated port: 5461b2f6-84ea-439e-a902-72bbf9a5aa30 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:48:36 np0005539505 nova_compute[186958]: 2025-11-29 07:48:36.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:36 np0005539505 nova_compute[186958]: 2025-11-29 07:48:36.500 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:36 np0005539505 nova_compute[186958]: 2025-11-29 07:48:36.500 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquired lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:36 np0005539505 nova_compute[186958]: 2025-11-29 07:48:36.500 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:48:37 np0005539505 nova_compute[186958]: 2025-11-29 07:48:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:37 np0005539505 nova_compute[186958]: 2025-11-29 07:48:37.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:37 np0005539505 nova_compute[186958]: 2025-11-29 07:48:37.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:48:37 np0005539505 nova_compute[186958]: 2025-11-29 07:48:37.888 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:38 np0005539505 nova_compute[186958]: 2025-11-29 07:48:38.072 186962 DEBUG nova.compute.manager [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-changed-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:38 np0005539505 nova_compute[186958]: 2025-11-29 07:48:38.072 186962 DEBUG nova.compute.manager [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Refreshing instance network info cache due to event network-changed-5461b2f6-84ea-439e-a902-72bbf9a5aa30. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:48:38 np0005539505 nova_compute[186958]: 2025-11-29 07:48:38.073 186962 DEBUG oslo_concurrency.lockutils [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:38 np0005539505 nova_compute[186958]: 2025-11-29 07:48:38.233 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:38 np0005539505 nova_compute[186958]: 2025-11-29 07:48:38.518 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.949 186962 DEBUG nova.network.neutron [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updating instance_info_cache with network_info: [{"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.970 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Releasing lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.970 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Instance network_info: |[{"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.971 186962 DEBUG oslo_concurrency.lockutils [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.971 186962 DEBUG nova.network.neutron [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Refreshing network info cache for port 5461b2f6-84ea-439e-a902-72bbf9a5aa30 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.974 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Start _get_guest_xml network_info=[{"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.978 186962 WARNING nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.982 186962 DEBUG nova.virt.libvirt.host [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.983 186962 DEBUG nova.virt.libvirt.host [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.986 186962 DEBUG nova.virt.libvirt.host [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.986 186962 DEBUG nova.virt.libvirt.host [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.988 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.988 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.988 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.989 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.989 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.989 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.989 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.990 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.990 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.990 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.990 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.991 186962 DEBUG nova.virt.hardware [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.995 186962 DEBUG nova.virt.libvirt.vif [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1302198804',display_name='tempest-TestServerMultinode-server-1302198804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1302198804',id=172,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-w8zfmtgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-5
21650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:32Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=55f06dbd-4385-42ee-b258-fb14baca55e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.995 186962 DEBUG nova.network.os_vif_util [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.996 186962 DEBUG nova.network.os_vif_util [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:39 np0005539505 nova_compute[186958]: 2025-11-29 07:48:39.997 186962 DEBUG nova.objects.instance [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'pci_devices' on Instance uuid 55f06dbd-4385-42ee-b258-fb14baca55e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.013 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <uuid>55f06dbd-4385-42ee-b258-fb14baca55e4</uuid>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <name>instance-000000ac</name>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestServerMultinode-server-1302198804</nova:name>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:48:39</nova:creationTime>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:user uuid="b79809b822b248ae8be15d0233f5896e">tempest-TestServerMultinode-521650901-project-admin</nova:user>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:project uuid="220340bd80db4bf5af391eb2e4247a6c">tempest-TestServerMultinode-521650901</nova:project>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        <nova:port uuid="5461b2f6-84ea-439e-a902-72bbf9a5aa30">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="serial">55f06dbd-4385-42ee-b258-fb14baca55e4</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="uuid">55f06dbd-4385-42ee-b258-fb14baca55e4</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.config"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:96:6a:76"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <target dev="tap5461b2f6-84"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/console.log" append="off"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:48:40 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:48:40 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:48:40 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:48:40 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.014 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Preparing to wait for external event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.015 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.015 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.015 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.016 186962 DEBUG nova.virt.libvirt.vif [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:48:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1302198804',display_name='tempest-TestServerMultinode-server-1302198804',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1302198804',id=172,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-w8zfmtgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:48:32Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=55f06dbd-4385-42ee-b258-fb14baca55e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.016 186962 DEBUG nova.network.os_vif_util [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.017 186962 DEBUG nova.network.os_vif_util [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.017 186962 DEBUG os_vif [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.018 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.018 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.019 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.022 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.022 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5461b2f6-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.023 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5461b2f6-84, col_values=(('external_ids', {'iface-id': '5461b2f6-84ea-439e-a902-72bbf9a5aa30', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:6a:76', 'vm-uuid': '55f06dbd-4385-42ee-b258-fb14baca55e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.024 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 NetworkManager[55134]: <info>  [1764402520.0256] manager: (tap5461b2f6-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.027 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.032 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.033 186962 INFO os_vif [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84')#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.122 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.123 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.123 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] No VIF found with MAC fa:16:3e:96:6a:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.124 186962 INFO nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Using config drive#033[00m
Nov 29 02:48:40 np0005539505 podman[250724]: 2025-11-29 07:48:40.743628922 +0000 UTC m=+0.076174515 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:48:40 np0005539505 podman[250723]: 2025-11-29 07:48:40.754553505 +0000 UTC m=+0.082801575 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.785 186962 INFO nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Creating config drive at /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.config#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.790 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpprli74rt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.915 186962 DEBUG oslo_concurrency.processutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpprli74rt" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:48:40 np0005539505 kernel: tap5461b2f6-84: entered promiscuous mode
Nov 29 02:48:40 np0005539505 NetworkManager[55134]: <info>  [1764402520.9694] manager: (tap5461b2f6-84): new Tun device (/org/freedesktop/NetworkManager/Devices/391)
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.969 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:40Z|00795|binding|INFO|Claiming lport 5461b2f6-84ea-439e-a902-72bbf9a5aa30 for this chassis.
Nov 29 02:48:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:40Z|00796|binding|INFO|5461b2f6-84ea-439e-a902-72bbf9a5aa30: Claiming fa:16:3e:96:6a:76 10.100.0.13
Nov 29 02:48:40 np0005539505 nova_compute[186958]: 2025-11-29 07:48:40.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:40.988 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:6a:76 10.100.0.13'], port_security=['fa:16:3e:96:6a:76 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '55f06dbd-4385-42ee-b258-fb14baca55e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5461b2f6-84ea-439e-a902-72bbf9a5aa30) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:40.989 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5461b2f6-84ea-439e-a902-72bbf9a5aa30 in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d bound to our chassis#033[00m
Nov 29 02:48:40 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:40.991 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d#033[00m
Nov 29 02:48:40 np0005539505 systemd-udevd[250787]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[98c1ebab-03b7-4a2a-860b-c2947cbe16b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.005 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7fbe5e7f-51 in ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.008 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7fbe5e7f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.008 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3a604e-1397-4c41-93c2-e28b59f06408]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.009 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a5acc6f9-198e-4ef3-87a2-b080a0cfe863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 NetworkManager[55134]: <info>  [1764402521.0132] device (tap5461b2f6-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:48:41 np0005539505 NetworkManager[55134]: <info>  [1764402521.0146] device (tap5461b2f6-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.019 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3cce5972-a1f6-44ec-a4b0-2333435b5b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 systemd-machined[153285]: New machine qemu-83-instance-000000ac.
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.032 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.033 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f95c892b-8818-470f-8789-f1989513892e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:41Z|00797|binding|INFO|Setting lport 5461b2f6-84ea-439e-a902-72bbf9a5aa30 ovn-installed in OVS
Nov 29 02:48:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:41Z|00798|binding|INFO|Setting lport 5461b2f6-84ea-439e-a902-72bbf9a5aa30 up in Southbound
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.039 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:41 np0005539505 systemd[1]: Started Virtual Machine qemu-83-instance-000000ac.
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.059 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[177ba739-3a0d-4422-9678-aa7ee1936a73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.064 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1137d97c-8c6a-4089-80e8-2b2be63f6f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 NetworkManager[55134]: <info>  [1764402521.0656] manager: (tap7fbe5e7f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/392)
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.095 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[cc71d239-8ff1-4e76-9e3b-7388feba8a00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.098 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[642e2ada-82d4-4048-bd5e-978566e7976a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 NetworkManager[55134]: <info>  [1764402521.1187] device (tap7fbe5e7f-50): carrier: link connected
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.122 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf3319a-8b07-499f-8046-fa4e3f806e46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.139 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[09f08872-96ad-4397-9906-f0c5f705e8b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796870, 'reachable_time': 39209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250823, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.154 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7f452183-2746-484e-9fd8-dc9cc661f6ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe76:bfc0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 796870, 'tstamp': 796870}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250824, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.171 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[53dd0b16-220b-4a60-8510-990f171cbfae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7fbe5e7f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:76:bf:c0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 247], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796870, 'reachable_time': 39209, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250825, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.201 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[04bbdae4-75c8-4c76-825e-69c9922445c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.269 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6522b1e2-f811-4571-ada9-f38f70355081]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.271 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.272 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.272 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7fbe5e7f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.275 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:41 np0005539505 kernel: tap7fbe5e7f-50: entered promiscuous mode
Nov 29 02:48:41 np0005539505 NetworkManager[55134]: <info>  [1764402521.2771] manager: (tap7fbe5e7f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/393)
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.277 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7fbe5e7f-50, col_values=(('external_ids', {'iface-id': 'e08502a1-bdde-4e8d-89e4-c05bd265f847'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.279 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:41Z|00799|binding|INFO|Releasing lport e08502a1-bdde-4e8d-89e4-c05bd265f847 from this chassis (sb_readonly=0)
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.291 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.292 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.293 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f2ffbb-447a-4dfa-b5a0-e27b940d9faa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.293 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.pid.haproxy
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:48:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:41.294 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'env', 'PROCESS_TAG=haproxy-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.323 186962 DEBUG nova.compute.manager [req-d52a63b4-2d55-4f7b-8607-c877d5816e01 req-0c551d7c-b4ba-40f7-8152-333291b75ac4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.323 186962 DEBUG oslo_concurrency.lockutils [req-d52a63b4-2d55-4f7b-8607-c877d5816e01 req-0c551d7c-b4ba-40f7-8152-333291b75ac4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.324 186962 DEBUG oslo_concurrency.lockutils [req-d52a63b4-2d55-4f7b-8607-c877d5816e01 req-0c551d7c-b4ba-40f7-8152-333291b75ac4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.324 186962 DEBUG oslo_concurrency.lockutils [req-d52a63b4-2d55-4f7b-8607-c877d5816e01 req-0c551d7c-b4ba-40f7-8152-333291b75ac4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.324 186962 DEBUG nova.compute.manager [req-d52a63b4-2d55-4f7b-8607-c877d5816e01 req-0c551d7c-b4ba-40f7-8152-333291b75ac4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Processing event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:48:41 np0005539505 podman[250857]: 2025-11-29 07:48:41.632240479 +0000 UTC m=+0.048119821 container create f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:48:41 np0005539505 systemd[1]: Started libpod-conmon-f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692.scope.
Nov 29 02:48:41 np0005539505 podman[250857]: 2025-11-29 07:48:41.605622726 +0000 UTC m=+0.021502078 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:48:41 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:48:41 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f3de99b39d118c1f85eb2516066d8546deb8b13c0c877e250ded67324e5aeea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.775 186962 DEBUG nova.network.neutron [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updated VIF entry in instance network info cache for port 5461b2f6-84ea-439e-a902-72bbf9a5aa30. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.775 186962 DEBUG nova.network.neutron [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updating instance_info_cache with network_info: [{"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.794 186962 DEBUG oslo_concurrency.lockutils [req-7d865c54-a719-4ed6-8633-36a0456f18ba req-a3d0d95b-ed76-4c47-a8a9-62fe3545d5fc 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:41 np0005539505 podman[250857]: 2025-11-29 07:48:41.79551162 +0000 UTC m=+0.211390962 container init f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:48:41 np0005539505 podman[250857]: 2025-11-29 07:48:41.801522852 +0000 UTC m=+0.217402174 container start f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:48:41 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [NOTICE]   (250882) : New worker (250885) forked
Nov 29 02:48:41 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [NOTICE]   (250882) : Loading success.
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.852 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402521.851826, 55f06dbd-4385-42ee-b258-fb14baca55e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.853 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] VM Started (Lifecycle Event)#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.856 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.859 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.865 186962 INFO nova.virt.libvirt.driver [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Instance spawned successfully.#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.866 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.874 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.878 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.886 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.886 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.887 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.888 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.888 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.889 186962 DEBUG nova.virt.libvirt.driver [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.898 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.899 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402521.8520622, 55f06dbd-4385-42ee-b258-fb14baca55e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.899 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.921 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.924 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402521.858567, 55f06dbd-4385-42ee-b258-fb14baca55e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.924 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.958 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:41 np0005539505 nova_compute[186958]: 2025-11-29 07:48:41.960 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.004 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.007 186962 INFO nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Took 9.62 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.007 186962 DEBUG nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.099 186962 INFO nova.compute.manager [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Took 11.32 seconds to build instance.#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.117 186962 DEBUG oslo_concurrency.lockutils [None req-a9a8bb85-1cb6-4a72-b4c7-831c9d4908d6 b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.769 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.770 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.770 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:48:42 np0005539505 nova_compute[186958]: 2025-11-29 07:48:42.770 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 55f06dbd-4385-42ee-b258-fb14baca55e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.098 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.098 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.098 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.099 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.099 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.111 186962 INFO nova.compute.manager [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Terminating instance#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.122 186962 DEBUG nova.compute.manager [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:48:43 np0005539505 kernel: tap5461b2f6-84 (unregistering): left promiscuous mode
Nov 29 02:48:43 np0005539505 NetworkManager[55134]: <info>  [1764402523.1422] device (tap5461b2f6-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:48:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:43Z|00800|binding|INFO|Releasing lport 5461b2f6-84ea-439e-a902-72bbf9a5aa30 from this chassis (sb_readonly=0)
Nov 29 02:48:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:43Z|00801|binding|INFO|Setting lport 5461b2f6-84ea-439e-a902-72bbf9a5aa30 down in Southbound
Nov 29 02:48:43 np0005539505 ovn_controller[95143]: 2025-11-29T07:48:43Z|00802|binding|INFO|Removing iface tap5461b2f6-84 ovn-installed in OVS
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.152 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.158 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:96:6a:76 10.100.0.13'], port_security=['fa:16:3e:96:6a:76 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '55f06dbd-4385-42ee-b258-fb14baca55e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '220340bd80db4bf5af391eb2e4247a6c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d07af2a-16f6-4fe3-b2a4-ed6b96a38a93', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb62e23-e8c7-432f-b445-db50c529fe8e, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=5461b2f6-84ea-439e-a902-72bbf9a5aa30) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.160 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 5461b2f6-84ea-439e-a902-72bbf9a5aa30 in datapath 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d unbound from our chassis#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.161 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.162 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bc056afe-905f-44f2-a9de-50a71c803135]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.162 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d namespace which is not needed anymore#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.166 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ac.scope: Deactivated successfully.
Nov 29 02:48:43 np0005539505 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ac.scope: Consumed 2.096s CPU time.
Nov 29 02:48:43 np0005539505 systemd-machined[153285]: Machine qemu-83-instance-000000ac terminated.
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.234 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [NOTICE]   (250882) : haproxy version is 2.8.14-c23fe91
Nov 29 02:48:43 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [NOTICE]   (250882) : path to executable is /usr/sbin/haproxy
Nov 29 02:48:43 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [WARNING]  (250882) : Exiting Master process...
Nov 29 02:48:43 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [ALERT]    (250882) : Current worker (250885) exited with code 143 (Terminated)
Nov 29 02:48:43 np0005539505 neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d[250872]: [WARNING]  (250882) : All workers exited. Exiting... (0)
Nov 29 02:48:43 np0005539505 systemd[1]: libpod-f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692.scope: Deactivated successfully.
Nov 29 02:48:43 np0005539505 podman[250916]: 2025-11-29 07:48:43.284770657 +0000 UTC m=+0.042233872 container died f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:48:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692-userdata-shm.mount: Deactivated successfully.
Nov 29 02:48:43 np0005539505 systemd[1]: var-lib-containers-storage-overlay-8f3de99b39d118c1f85eb2516066d8546deb8b13c0c877e250ded67324e5aeea-merged.mount: Deactivated successfully.
Nov 29 02:48:43 np0005539505 podman[250916]: 2025-11-29 07:48:43.323173288 +0000 UTC m=+0.080636503 container cleanup f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:48:43 np0005539505 systemd[1]: libpod-conmon-f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692.scope: Deactivated successfully.
Nov 29 02:48:43 np0005539505 NetworkManager[55134]: <info>  [1764402523.3410] manager: (tap5461b2f6-84): new Tun device (/org/freedesktop/NetworkManager/Devices/394)
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.346 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 podman[250948]: 2025-11-29 07:48:43.381713567 +0000 UTC m=+0.037741383 container remove f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.386 186962 INFO nova.virt.libvirt.driver [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Instance destroyed successfully.#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.387 186962 DEBUG nova.objects.instance [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lazy-loading 'resources' on Instance uuid 55f06dbd-4385-42ee-b258-fb14baca55e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.390 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e13691-9798-432e-a0a6-7b13fc0ea460]: (4, ('Sat Nov 29 07:48:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692)\nf37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692\nSat Nov 29 07:48:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d (f37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692)\nf37444defc87c0f51e64a9f7dc5348139468038c0eda23de73320179a7074692\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.392 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d570de1d-343d-44af-b2cd-921d639c0890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.393 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7fbe5e7f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.395 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 kernel: tap7fbe5e7f-50: left promiscuous mode
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.410 186962 DEBUG nova.virt.libvirt.vif [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:48:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1302198804',display_name='tempest-TestServerMultinode-server-1302198804',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1302198804',id=172,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:48:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='220340bd80db4bf5af391eb2e4247a6c',ramdisk_id='',reservation_id='r-w8zfmtgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-521650901',owner_user_name='tempest-TestServerMultinode-521650901-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:48:42Z,user_data=None,user_id='b79809b822b248ae8be15d0233f5896e',uuid=55f06dbd-4385-42ee-b258-fb14baca55e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.410 186962 DEBUG nova.network.os_vif_util [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converting VIF {"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.411 186962 DEBUG nova.network.os_vif_util [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.411 186962 DEBUG os_vif [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.411 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[26f27444-32a1-47ca-9c62-7736688da809]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.413 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.413 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5461b2f6-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.414 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.414 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.416 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.418 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.420 186962 INFO os_vif [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:6a:76,bridge_name='br-int',has_traffic_filtering=True,id=5461b2f6-84ea-439e-a902-72bbf9a5aa30,network=Network(7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5461b2f6-84')#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.420 186962 INFO nova.virt.libvirt.driver [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Deleting instance files /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4_del#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.421 186962 INFO nova.virt.libvirt.driver [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Deletion of /var/lib/nova/instances/55f06dbd-4385-42ee-b258-fb14baca55e4_del complete#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.432 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed897e9-65bf-4dd7-a9d0-3cb76647355a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.434 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5bfd20-c212-4d6b-b2ce-ace5195c357e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.448 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76ef78d9-efb8-437b-b65d-a5ad333b09de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 796863, 'reachable_time': 24272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250980, 'error': None, 'target': 'ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.450 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:48:43 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:48:43.451 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3868bcb9-b36d-4d02-a95e-3bdc192fda76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:48:43 np0005539505 systemd[1]: run-netns-ovnmeta\x2d7fbe5e7f\x2d5bf0\x2d42e8\x2d9d22\x2dc7ee6968433d.mount: Deactivated successfully.
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.467 186962 DEBUG nova.compute.manager [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.467 186962 DEBUG oslo_concurrency.lockutils [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.467 186962 DEBUG oslo_concurrency.lockutils [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.468 186962 DEBUG oslo_concurrency.lockutils [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.468 186962 DEBUG nova.compute.manager [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] No waiting events found dispatching network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.468 186962 WARNING nova.compute.manager [req-96d26323-fff0-4f8d-976a-9db575129ed6 req-d9752e1e-2052-4286-8958-0363df6aa2b6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received unexpected event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 for instance with vm_state active and task_state deleting.#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.625 186962 INFO nova.compute.manager [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Took 0.50 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.625 186962 DEBUG oslo.service.loopingcall [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.625 186962 DEBUG nova.compute.manager [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:48:43 np0005539505 nova_compute[186958]: 2025-11-29 07:48:43.626 186962 DEBUG nova.network.neutron [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.406 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updating instance_info_cache with network_info: [{"id": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "address": "fa:16:3e:96:6a:76", "network": {"id": "7fbe5e7f-5bf0-42e8-9d22-c7ee6968433d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1724511624-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ff00c41537a48ac87c3d3460e6ff7e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5461b2f6-84", "ovs_interfaceid": "5461b2f6-84ea-439e-a902-72bbf9a5aa30", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.447 186962 DEBUG nova.network.neutron [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.449 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-55f06dbd-4385-42ee-b258-fb14baca55e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.449 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.450 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.470 186962 INFO nova.compute.manager [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Took 0.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.555 186962 DEBUG nova.compute.manager [req-90097fb9-a525-4e34-8ecd-f5b158ab9fe4 req-9c3964ec-6edc-4e49-9ad6-b7dde2b8a566 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-vif-deleted-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.574 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.575 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.621 186962 DEBUG nova.compute.provider_tree [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.636 186962 DEBUG nova.scheduler.client.report [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.656 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:44 np0005539505 nova_compute[186958]: 2025-11-29 07:48:44.677 186962 INFO nova.scheduler.client.report [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Deleted allocations for instance 55f06dbd-4385-42ee-b258-fb14baca55e4#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.011 186962 DEBUG oslo_concurrency.lockutils [None req-2f6a9b7b-f449-4928-bf2d-866c8cca332c b79809b822b248ae8be15d0233f5896e 220340bd80db4bf5af391eb2e4247a6c - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.913s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.597 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.598 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.599 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.599 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:48:45 np0005539505 podman[250982]: 2025-11-29 07:48:45.707784065 +0000 UTC m=+0.068407052 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 02:48:45 np0005539505 podman[250983]: 2025-11-29 07:48:45.706627712 +0000 UTC m=+0.063194173 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.765 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.767 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5619MB free_disk=73.07303237915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.767 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.767 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.885 186962 DEBUG nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-vif-unplugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.885 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.885 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.886 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.886 186962 DEBUG nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] No waiting events found dispatching network-vif-unplugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.886 186962 WARNING nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received unexpected event network-vif-unplugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 for instance with vm_state deleted and task_state None.
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.886 186962 DEBUG nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.887 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.887 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.887 186962 DEBUG oslo_concurrency.lockutils [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "55f06dbd-4385-42ee-b258-fb14baca55e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.887 186962 DEBUG nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] No waiting events found dispatching network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.888 186962 WARNING nova.compute.manager [req-6bd2d593-e57a-4d84-a8f6-f7fa8641b5b4 req-dd3029c3-73dc-435d-896a-5834dfc87875 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Received unexpected event network-vif-plugged-5461b2f6-84ea-439e-a902-72bbf9a5aa30 for instance with vm_state deleted and task_state None.
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.939 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.940 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:48:45 np0005539505 nova_compute[186958]: 2025-11-29 07:48:45.963 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:48:46 np0005539505 nova_compute[186958]: 2025-11-29 07:48:46.379 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:48:46 np0005539505 nova_compute[186958]: 2025-11-29 07:48:46.877 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 02:48:46 np0005539505 nova_compute[186958]: 2025-11-29 07:48:46.878 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:48:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:48:48 np0005539505 nova_compute[186958]: 2025-11-29 07:48:48.236 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:48 np0005539505 nova_compute[186958]: 2025-11-29 07:48:48.414 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:52 np0005539505 nova_compute[186958]: 2025-11-29 07:48:52.873 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:48:52 np0005539505 nova_compute[186958]: 2025-11-29 07:48:52.874 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:48:53 np0005539505 nova_compute[186958]: 2025-11-29 07:48:53.237 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:53 np0005539505 nova_compute[186958]: 2025-11-29 07:48:53.416 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:58 np0005539505 nova_compute[186958]: 2025-11-29 07:48:58.238 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:58 np0005539505 nova_compute[186958]: 2025-11-29 07:48:58.385 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402523.3847742, 55f06dbd-4385-42ee-b258-fb14baca55e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:48:58 np0005539505 nova_compute[186958]: 2025-11-29 07:48:58.386 186962 INFO nova.compute.manager [-] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] VM Stopped (Lifecycle Event)
Nov 29 02:48:58 np0005539505 nova_compute[186958]: 2025-11-29 07:48:58.417 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:48:59 np0005539505 podman[251023]: 2025-11-29 07:48:59.725624747 +0000 UTC m=+0.062437681 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:48:59 np0005539505 podman[251024]: 2025-11-29 07:48:59.731685881 +0000 UTC m=+0.062492353 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:49:01 np0005539505 nova_compute[186958]: 2025-11-29 07:49:01.660 186962 DEBUG nova.compute.manager [None req-4d9c3f39-d7ec-45bc-b263-23ef6e0578a2 - - - - - -] [instance: 55f06dbd-4385-42ee-b258-fb14baca55e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:49:03 np0005539505 nova_compute[186958]: 2025-11-29 07:49:03.239 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:03 np0005539505 nova_compute[186958]: 2025-11-29 07:49:03.418 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:04 np0005539505 nova_compute[186958]: 2025-11-29 07:49:04.615 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:04 np0005539505 podman[251065]: 2025-11-29 07:49:04.708035063 +0000 UTC m=+0.046557075 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 02:49:08 np0005539505 nova_compute[186958]: 2025-11-29 07:49:08.242 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:08 np0005539505 nova_compute[186958]: 2025-11-29 07:49:08.453 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:11 np0005539505 podman[251085]: 2025-11-29 07:49:11.753928071 +0000 UTC m=+0.074023653 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:49:11 np0005539505 podman[251086]: 2025-11-29 07:49:11.818334578 +0000 UTC m=+0.136160365 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:49:13 np0005539505 nova_compute[186958]: 2025-11-29 07:49:13.244 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:13 np0005539505 nova_compute[186958]: 2025-11-29 07:49:13.454 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:16 np0005539505 podman[251136]: 2025-11-29 07:49:16.784407504 +0000 UTC m=+0.098636429 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 02:49:16 np0005539505 podman[251135]: 2025-11-29 07:49:16.80800432 +0000 UTC m=+0.130143522 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:49:18 np0005539505 nova_compute[186958]: 2025-11-29 07:49:18.247 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:18 np0005539505 nova_compute[186958]: 2025-11-29 07:49:18.456 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:23 np0005539505 nova_compute[186958]: 2025-11-29 07:49:23.248 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:23 np0005539505 nova_compute[186958]: 2025-11-29 07:49:23.456 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:27.534 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:27.535 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:49:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:27.535 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:49:28 np0005539505 nova_compute[186958]: 2025-11-29 07:49:28.250 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:28 np0005539505 nova_compute[186958]: 2025-11-29 07:49:28.459 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:30.676 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 29 02:49:30 np0005539505 nova_compute[186958]: 2025-11-29 07:49:30.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:49:30 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:30.677 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 29 02:49:30 np0005539505 podman[251177]: 2025-11-29 07:49:30.743574534 +0000 UTC m=+0.061366921 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9-minimal)
Nov 29 02:49:30 np0005539505 podman[251178]: 2025-11-29 07:49:30.752704796 +0000 UTC m=+0.065805788 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:49:33 np0005539505 nova_compute[186958]: 2025-11-29 07:49:33.252 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:33 np0005539505 nova_compute[186958]: 2025-11-29 07:49:33.461 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:35 np0005539505 podman[251220]: 2025-11-29 07:49:35.712954777 +0000 UTC m=+0.049896551 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:49:37 np0005539505 nova_compute[186958]: 2025-11-29 07:49:37.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:37 np0005539505 nova_compute[186958]: 2025-11-29 07:49:37.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:37 np0005539505 nova_compute[186958]: 2025-11-29 07:49:37.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:49:38 np0005539505 nova_compute[186958]: 2025-11-29 07:49:38.253 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:38 np0005539505 nova_compute[186958]: 2025-11-29 07:49:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:38 np0005539505 nova_compute[186958]: 2025-11-29 07:49:38.462 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:38.680 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:42 np0005539505 podman[251239]: 2025-11-29 07:49:42.743644608 +0000 UTC m=+0.067882078 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:49:42 np0005539505 podman[251240]: 2025-11-29 07:49:42.76604839 +0000 UTC m=+0.095247932 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Nov 29 02:49:43 np0005539505 nova_compute[186958]: 2025-11-29 07:49:43.297 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:43 np0005539505 nova_compute[186958]: 2025-11-29 07:49:43.465 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:44 np0005539505 nova_compute[186958]: 2025-11-29 07:49:44.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:44 np0005539505 nova_compute[186958]: 2025-11-29 07:49:44.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:49:44 np0005539505 nova_compute[186958]: 2025-11-29 07:49:44.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:49:44 np0005539505 nova_compute[186958]: 2025-11-29 07:49:44.395 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:49:44 np0005539505 nova_compute[186958]: 2025-11-29 07:49:44.395 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.413 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.534 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.535 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5704MB free_disk=73.07303237915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.535 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.535 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.712 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.712 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.729 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.742 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.743 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:49:45 np0005539505 nova_compute[186958]: 2025-11-29 07:49:45.743 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.490 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.491 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.534 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.729 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.730 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.735 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.735 186962 INFO nova.compute.claims [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.884 186962 DEBUG nova.compute.provider_tree [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.900 186962 DEBUG nova.scheduler.client.report [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.924 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.925 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.997 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:49:46 np0005539505 nova_compute[186958]: 2025-11-29 07:49:46.997 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.022 186962 INFO nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.042 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.161 186962 DEBUG nova.policy [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '607d794b09b34b829673198ba073234c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '10520ccc4be44f138c8dd72b1d5edabe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.174 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.175 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.176 186962 INFO nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating image(s)#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.176 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.176 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.177 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.191 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.250 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.251 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.252 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.263 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.320 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.321 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.532 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk 1073741824" returned: 0 in 0.211s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.533 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.533 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.588 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.589 186962 DEBUG nova.virt.disk.api [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Checking if we can resize image /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.590 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.648 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.649 186962 DEBUG nova.virt.disk.api [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Cannot resize image /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.650 186962 DEBUG nova.objects.instance [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'migration_context' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.697 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.698 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Ensure instance console log exists: /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.698 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.699 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.699 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:47 np0005539505 podman[251303]: 2025-11-29 07:49:47.726767385 +0000 UTC m=+0.061766462 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:49:47 np0005539505 podman[251306]: 2025-11-29 07:49:47.744047291 +0000 UTC m=+0.072643074 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:49:47 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.744 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:48 np0005539505 nova_compute[186958]: 2025-11-29 07:49:47.999 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Successfully created port: 9ff644c0-307e-470e-add6-ceb7d6a15833 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:49:48 np0005539505 nova_compute[186958]: 2025-11-29 07:49:48.300 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:48 np0005539505 nova_compute[186958]: 2025-11-29 07:49:48.466 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:51 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:51Z|00803|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 02:49:52 np0005539505 nova_compute[186958]: 2025-11-29 07:49:52.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.040 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Successfully updated port: 9ff644c0-307e-470e-add6-ceb7d6a15833 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.070 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.071 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.071 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.165 186962 DEBUG nova.compute.manager [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.166 186962 DEBUG nova.compute.manager [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing instance network info cache due to event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.166 186962 DEBUG oslo_concurrency.lockutils [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.304 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.468 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:53 np0005539505 nova_compute[186958]: 2025-11-29 07:49:53.790 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:49:54 np0005539505 nova_compute[186958]: 2025-11-29 07:49:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.273 186962 DEBUG nova.network.neutron [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.623 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.624 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance network_info: |[{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.624 186962 DEBUG oslo_concurrency.lockutils [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.625 186962 DEBUG nova.network.neutron [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.627 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Start _get_guest_xml network_info=[{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.633 186962 WARNING nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.638 186962 DEBUG nova.virt.libvirt.host [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.639 186962 DEBUG nova.virt.libvirt.host [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.643 186962 DEBUG nova.virt.libvirt.host [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.644 186962 DEBUG nova.virt.libvirt.host [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.646 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.646 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.647 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.647 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.648 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.648 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.648 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.648 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.649 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.649 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.649 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.649 186962 DEBUG nova.virt.hardware [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.655 186962 DEBUG nova.virt.libvirt.vif [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOibuoQlHAfiaCYj4CIpZ6qJyfUZB3vV71n9+dsfX5nOIOEpheW32pCW+5Jb9gHkOLzyMwIqxCRE794nWAzInQpsnJoEx4IsVlc2/LJryddECLsGRYjobhhHnV47L4pCvg==',key_name='tempest-TestShelveInstance-1313079058',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:49:47Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.655 186962 DEBUG nova.network.os_vif_util [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.656 186962 DEBUG nova.network.os_vif_util [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.657 186962 DEBUG nova.objects.instance [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'pci_devices' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.729 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <uuid>b97f1300-6668-4955-a425-98d44189860d</uuid>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <name>instance-000000af</name>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestShelveInstance-server-2098184144</nova:name>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:49:55</nova:creationTime>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:user uuid="607d794b09b34b829673198ba073234c">tempest-TestShelveInstance-1546337875-project-member</nova:user>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:project uuid="10520ccc4be44f138c8dd72b1d5edabe">tempest-TestShelveInstance-1546337875</nova:project>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        <nova:port uuid="9ff644c0-307e-470e-add6-ceb7d6a15833">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="serial">b97f1300-6668-4955-a425-98d44189860d</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="uuid">b97f1300-6668-4955-a425-98d44189860d</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:e5:2e:ef"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <target dev="tap9ff644c0-30"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/console.log" append="off"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:49:55 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:49:55 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:49:55 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:49:55 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.730 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Preparing to wait for external event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.731 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.731 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.731 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.732 186962 DEBUG nova.virt.libvirt.vif [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOibuoQlHAfiaCYj4CIpZ6qJyfUZB3vV71n9+dsfX5nOIOEpheW32pCW+5Jb9gHkOLzyMwIqxCRE794nWAzInQpsnJoEx4IsVlc2/LJryddECLsGRYjobhhHnV47L4pCvg==',key_name='tempest-TestShelveInstance-1313079058',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:49:47Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.732 186962 DEBUG nova.network.os_vif_util [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.733 186962 DEBUG nova.network.os_vif_util [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.734 186962 DEBUG os_vif [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.734 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.735 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.735 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.738 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.739 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9ff644c0-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.739 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9ff644c0-30, col_values=(('external_ids', {'iface-id': '9ff644c0-307e-470e-add6-ceb7d6a15833', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e5:2e:ef', 'vm-uuid': 'b97f1300-6668-4955-a425-98d44189860d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.741 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:55 np0005539505 NetworkManager[55134]: <info>  [1764402595.7417] manager: (tap9ff644c0-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/395)
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.743 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.748 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.749 186962 INFO os_vif [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30')#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.896 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.896 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.896 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] No VIF found with MAC fa:16:3e:e5:2e:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:49:55 np0005539505 nova_compute[186958]: 2025-11-29 07:49:55.897 186962 INFO nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Using config drive#033[00m
Nov 29 02:49:56 np0005539505 nova_compute[186958]: 2025-11-29 07:49:56.830 186962 INFO nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Creating config drive at /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config#033[00m
Nov 29 02:49:56 np0005539505 nova_compute[186958]: 2025-11-29 07:49:56.837 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmksk1xvv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:49:56 np0005539505 nova_compute[186958]: 2025-11-29 07:49:56.968 186962 DEBUG oslo_concurrency.processutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmksk1xvv" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:49:57 np0005539505 kernel: tap9ff644c0-30: entered promiscuous mode
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.034 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:57Z|00804|binding|INFO|Claiming lport 9ff644c0-307e-470e-add6-ceb7d6a15833 for this chassis.
Nov 29 02:49:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:57Z|00805|binding|INFO|9ff644c0-307e-470e-add6-ceb7d6a15833: Claiming fa:16:3e:e5:2e:ef 10.100.0.4
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.0394] manager: (tap9ff644c0-30): new Tun device (/org/freedesktop/NetworkManager/Devices/396)
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.043 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.052 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:2e:ef 10.100.0.4'], port_security=['fa:16:3e:e5:2e:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b97f1300-6668-4955-a425-98d44189860d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10520ccc4be44f138c8dd72b1d5edabe', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6d0d2b51-2297-4c99-81c6-529d5b2de4c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ef9cade-d410-4695-ad11-b5fe7020e3e8, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9ff644c0-307e-470e-add6-ceb7d6a15833) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.053 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff644c0-307e-470e-add6-ceb7d6a15833 in datapath d7cbfb39-b4f8-4082-be26-e925bf6de50f bound to our chassis#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.056 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7cbfb39-b4f8-4082-be26-e925bf6de50f#033[00m
Nov 29 02:49:57 np0005539505 systemd-udevd[251366]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:49:57 np0005539505 systemd-machined[153285]: New machine qemu-84-instance-000000af.
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.073 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b725f215-a8eb-4ad8-a788-ce4101bb0c89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.075 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7cbfb39-b1 in ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.076 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7cbfb39-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.077 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1c0c8e-21cb-49e1-a8bb-fa2dfb8b7cca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.077 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb7c8ac-dcb1-411f-b328-65ca39ba268a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.0821] device (tap9ff644c0-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.0841] device (tap9ff644c0-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:49:57 np0005539505 systemd[1]: Started Virtual Machine qemu-84-instance-000000af.
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.097 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f5b41a-8b79-4d11-8c92-cb89adee47d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.101 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:57Z|00806|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 ovn-installed in OVS
Nov 29 02:49:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:57Z|00807|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 up in Southbound
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.106 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.113 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d6164613-cfcc-4c66-a8be-634a32ee47ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.143 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[7e780f49-2811-40b6-8fb9-c387e32737ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.1488] manager: (tapd7cbfb39-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/397)
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.149 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f35fcb9-59e0-4955-b378-868dc90632a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.195 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[678f3d4a-cb68-4efd-9f89-627ee8f12a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.199 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[90bbfe42-2fc9-405f-88a8-ed9f709f1809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.2253] device (tapd7cbfb39-b0): carrier: link connected
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.231 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[880505c3-5bfc-4703-9692-31f124bc4978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.249 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9007e1e9-a045-4692-be6e-ad872aa25688]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7cbfb39-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:03:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804480, 'reachable_time': 18338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251398, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.269 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd08af8-d844-4917-9fa6-9114ddc81a2f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedb:395'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 804480, 'tstamp': 804480}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251399, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.295 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5c85c362-deaf-465b-bcda-f76ebcc9031c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7cbfb39-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:db:03:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 250], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804480, 'reachable_time': 18338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251400, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.335 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6313c105-2040-4499-ada9-f590e80de1e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.381 186962 DEBUG nova.network.neutron [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updated VIF entry in instance network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.382 186962 DEBUG nova.network.neutron [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.405 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f515a8df-c5fe-4b01-b11f-024465e95d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.406 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7cbfb39-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.407 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.407 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7cbfb39-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:57 np0005539505 kernel: tapd7cbfb39-b0: entered promiscuous mode
Nov 29 02:49:57 np0005539505 NetworkManager[55134]: <info>  [1764402597.4097] manager: (tapd7cbfb39-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.409 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.414 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7cbfb39-b0, col_values=(('external_ids', {'iface-id': '963b66d7-328f-4f14-a5c3-5bc0702a4524'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:49:57 np0005539505 ovn_controller[95143]: 2025-11-29T07:49:57Z|00808|binding|INFO|Releasing lport 963b66d7-328f-4f14-a5c3-5bc0702a4524 from this chassis (sb_readonly=0)
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.415 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.419 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.420 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4bdbd8b9-33a2-4aab-8c5c-49ba876d9e48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.421 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-d7cbfb39-b4f8-4082-be26-e925bf6de50f
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/d7cbfb39-b4f8-4082-be26-e925bf6de50f.pid.haproxy
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID d7cbfb39-b4f8-4082-be26-e925bf6de50f
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:49:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:49:57.421 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'env', 'PROCESS_TAG=haproxy-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7cbfb39-b4f8-4082-be26-e925bf6de50f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.426 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.505 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402597.5052776, b97f1300-6668-4955-a425-98d44189860d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.506 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Started (Lifecycle Event)#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.597 186962 DEBUG nova.compute.manager [req-49092269-38a7-429e-b760-10948e8cbe84 req-06a54383-883f-4657-a4dd-7d4a5a32a66a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.598 186962 DEBUG oslo_concurrency.lockutils [req-49092269-38a7-429e-b760-10948e8cbe84 req-06a54383-883f-4657-a4dd-7d4a5a32a66a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.598 186962 DEBUG oslo_concurrency.lockutils [req-49092269-38a7-429e-b760-10948e8cbe84 req-06a54383-883f-4657-a4dd-7d4a5a32a66a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.599 186962 DEBUG oslo_concurrency.lockutils [req-49092269-38a7-429e-b760-10948e8cbe84 req-06a54383-883f-4657-a4dd-7d4a5a32a66a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.599 186962 DEBUG nova.compute.manager [req-49092269-38a7-429e-b760-10948e8cbe84 req-06a54383-883f-4657-a4dd-7d4a5a32a66a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Processing event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.600 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.605 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.609 186962 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance spawned successfully.#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.609 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.612 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.613 186962 DEBUG oslo_concurrency.lockutils [req-550e2b96-8db2-4dd3-a351-7edc0df3eef5 req-fd984b8c-fdfd-48d5-886c-e9ba1ce4b668 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.615 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.629 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.630 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.630 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.630 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.631 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.631 186962 DEBUG nova.virt.libvirt.driver [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.641 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.642 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402597.5054874, b97f1300-6668-4955-a425-98d44189860d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.642 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.781 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.785 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402597.6046746, b97f1300-6668-4955-a425-98d44189860d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.786 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:49:57 np0005539505 podman[251439]: 2025-11-29 07:49:57.792891621 +0000 UTC m=+0.049086038 container create 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:49:57 np0005539505 systemd[1]: Started libpod-conmon-96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639.scope.
Nov 29 02:49:57 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:49:57 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f716d80073ecf1fc520818078608f24395e92e8386d45c5c63db5fa0a065feba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:49:57 np0005539505 podman[251439]: 2025-11-29 07:49:57.76599712 +0000 UTC m=+0.022191557 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:49:57 np0005539505 podman[251439]: 2025-11-29 07:49:57.86992027 +0000 UTC m=+0.126114707 container init 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:49:57 np0005539505 podman[251439]: 2025-11-29 07:49:57.87552689 +0000 UTC m=+0.131721317 container start 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.896 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:57 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [NOTICE]   (251459) : New worker (251461) forked
Nov 29 02:49:57 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [NOTICE]   (251459) : Loading success.
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.901 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.920 186962 INFO nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Took 10.75 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.921 186962 DEBUG nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:49:57 np0005539505 nova_compute[186958]: 2025-11-29 07:49:57.932 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:49:58 np0005539505 nova_compute[186958]: 2025-11-29 07:49:58.306 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:49:58 np0005539505 nova_compute[186958]: 2025-11-29 07:49:58.345 186962 INFO nova.compute.manager [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Took 11.72 seconds to build instance.#033[00m
Nov 29 02:49:58 np0005539505 nova_compute[186958]: 2025-11-29 07:49:58.371 186962 DEBUG oslo_concurrency.lockutils [None req-5b4d3bfa-c371-46fe-ae3d-594498d28e95 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.699 186962 DEBUG nova.compute.manager [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.699 186962 DEBUG oslo_concurrency.lockutils [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.700 186962 DEBUG oslo_concurrency.lockutils [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.700 186962 DEBUG oslo_concurrency.lockutils [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.700 186962 DEBUG nova.compute.manager [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:49:59 np0005539505 nova_compute[186958]: 2025-11-29 07:49:59.701 186962 WARNING nova.compute.manager [req-24eb3113-83c1-40ac-9015-c2267504c7db req-df394df1-c1ea-4dd3-ba99-ca10c9c719c3 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received unexpected event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:50:00 np0005539505 nova_compute[186958]: 2025-11-29 07:50:00.745 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:01 np0005539505 podman[251470]: 2025-11-29 07:50:01.734063515 +0000 UTC m=+0.060061433 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 29 02:50:01 np0005539505 podman[251471]: 2025-11-29 07:50:01.766746292 +0000 UTC m=+0.082356802 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:50:01 np0005539505 NetworkManager[55134]: <info>  [1764402601.8189] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/399)
Nov 29 02:50:01 np0005539505 NetworkManager[55134]: <info>  [1764402601.8198] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/400)
Nov 29 02:50:01 np0005539505 nova_compute[186958]: 2025-11-29 07:50:01.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:01 np0005539505 nova_compute[186958]: 2025-11-29 07:50:01.988 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:01 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:01Z|00809|binding|INFO|Releasing lport 963b66d7-328f-4f14-a5c3-5bc0702a4524 from this chassis (sb_readonly=0)
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.019 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.214 186962 DEBUG nova.compute.manager [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.215 186962 DEBUG nova.compute.manager [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing instance network info cache due to event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.215 186962 DEBUG oslo_concurrency.lockutils [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.216 186962 DEBUG oslo_concurrency.lockutils [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:02 np0005539505 nova_compute[186958]: 2025-11-29 07:50:02.216 186962 DEBUG nova.network.neutron [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:50:03 np0005539505 nova_compute[186958]: 2025-11-29 07:50:03.308 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:05 np0005539505 nova_compute[186958]: 2025-11-29 07:50:05.753 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:06 np0005539505 podman[251514]: 2025-11-29 07:50:06.760997727 +0000 UTC m=+0.087135769 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:50:07 np0005539505 nova_compute[186958]: 2025-11-29 07:50:07.806 186962 DEBUG nova.network.neutron [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updated VIF entry in instance network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:50:07 np0005539505 nova_compute[186958]: 2025-11-29 07:50:07.807 186962 DEBUG nova.network.neutron [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:07 np0005539505 nova_compute[186958]: 2025-11-29 07:50:07.831 186962 DEBUG oslo_concurrency.lockutils [req-b4b305ce-4c19-4773-8709-61efae266bfa req-13676beb-ed8b-49a6-bb36-f7bee41ceace 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:08 np0005539505 nova_compute[186958]: 2025-11-29 07:50:08.310 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:10 np0005539505 nova_compute[186958]: 2025-11-29 07:50:10.758 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:12Z|00092|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e5:2e:ef 10.100.0.4
Nov 29 02:50:12 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:12Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e5:2e:ef 10.100.0.4
Nov 29 02:50:13 np0005539505 nova_compute[186958]: 2025-11-29 07:50:13.313 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:13 np0005539505 podman[251548]: 2025-11-29 07:50:13.778852867 +0000 UTC m=+0.057619923 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:50:13 np0005539505 podman[251549]: 2025-11-29 07:50:13.819230555 +0000 UTC m=+0.094496580 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:50:14 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:14Z|00810|binding|INFO|Releasing lport 963b66d7-328f-4f14-a5c3-5bc0702a4524 from this chassis (sb_readonly=0)
Nov 29 02:50:14 np0005539505 nova_compute[186958]: 2025-11-29 07:50:14.906 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:15 np0005539505 nova_compute[186958]: 2025-11-29 07:50:15.760 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.195 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.315 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.655 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.655 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.655 186962 INFO nova.compute.manager [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Shelving#033[00m
Nov 29 02:50:18 np0005539505 nova_compute[186958]: 2025-11-29 07:50:18.705 186962 DEBUG nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 29 02:50:18 np0005539505 podman[251596]: 2025-11-29 07:50:18.750003122 +0000 UTC m=+0.064441978 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:50:18 np0005539505 podman[251595]: 2025-11-29 07:50:18.776130161 +0000 UTC m=+0.095145358 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:50:20 np0005539505 nova_compute[186958]: 2025-11-29 07:50:20.762 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539505 kernel: tap9ff644c0-30 (unregistering): left promiscuous mode
Nov 29 02:50:20 np0005539505 NetworkManager[55134]: <info>  [1764402620.9020] device (tap9ff644c0-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:50:20 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:20Z|00811|binding|INFO|Releasing lport 9ff644c0-307e-470e-add6-ceb7d6a15833 from this chassis (sb_readonly=0)
Nov 29 02:50:20 np0005539505 nova_compute[186958]: 2025-11-29 07:50:20.909 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:20Z|00812|binding|INFO|Setting lport 9ff644c0-307e-470e-add6-ceb7d6a15833 down in Southbound
Nov 29 02:50:20 np0005539505 ovn_controller[95143]: 2025-11-29T07:50:20Z|00813|binding|INFO|Removing iface tap9ff644c0-30 ovn-installed in OVS
Nov 29 02:50:20 np0005539505 nova_compute[186958]: 2025-11-29 07:50:20.913 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:20.919 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e5:2e:ef 10.100.0.4'], port_security=['fa:16:3e:e5:2e:ef 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b97f1300-6668-4955-a425-98d44189860d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10520ccc4be44f138c8dd72b1d5edabe', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6d0d2b51-2297-4c99-81c6-529d5b2de4c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.231'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ef9cade-d410-4695-ad11-b5fe7020e3e8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=9ff644c0-307e-470e-add6-ceb7d6a15833) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:50:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:20.922 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 9ff644c0-307e-470e-add6-ceb7d6a15833 in datapath d7cbfb39-b4f8-4082-be26-e925bf6de50f unbound from our chassis#033[00m
Nov 29 02:50:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:20.925 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7cbfb39-b4f8-4082-be26-e925bf6de50f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:50:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:20.926 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[680315ae-aade-4524-888c-6a6b4e40f0f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:20 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:20.927 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f namespace which is not needed anymore#033[00m
Nov 29 02:50:20 np0005539505 nova_compute[186958]: 2025-11-29 07:50:20.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:20 np0005539505 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Deactivated successfully.
Nov 29 02:50:20 np0005539505 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000af.scope: Consumed 13.015s CPU time.
Nov 29 02:50:20 np0005539505 systemd-machined[153285]: Machine qemu-84-instance-000000af terminated.
Nov 29 02:50:21 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [NOTICE]   (251459) : haproxy version is 2.8.14-c23fe91
Nov 29 02:50:21 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [NOTICE]   (251459) : path to executable is /usr/sbin/haproxy
Nov 29 02:50:21 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [WARNING]  (251459) : Exiting Master process...
Nov 29 02:50:21 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [ALERT]    (251459) : Current worker (251461) exited with code 143 (Terminated)
Nov 29 02:50:21 np0005539505 neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f[251455]: [WARNING]  (251459) : All workers exited. Exiting... (0)
Nov 29 02:50:21 np0005539505 systemd[1]: libpod-96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639.scope: Deactivated successfully.
Nov 29 02:50:21 np0005539505 podman[251654]: 2025-11-29 07:50:21.143567146 +0000 UTC m=+0.115400919 container died 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:50:21 np0005539505 nova_compute[186958]: 2025-11-29 07:50:21.723 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance shutdown successfully after 3 seconds.#033[00m
Nov 29 02:50:21 np0005539505 nova_compute[186958]: 2025-11-29 07:50:21.728 186962 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance destroyed successfully.#033[00m
Nov 29 02:50:21 np0005539505 nova_compute[186958]: 2025-11-29 07:50:21.729 186962 DEBUG nova.objects.instance [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'numa_topology' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.046 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Beginning cold snapshot process#033[00m
Nov 29 02:50:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay-f716d80073ecf1fc520818078608f24395e92e8386d45c5c63db5fa0a065feba-merged.mount: Deactivated successfully.
Nov 29 02:50:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639-userdata-shm.mount: Deactivated successfully.
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.500 186962 DEBUG nova.privsep.utils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.501 186962 DEBUG oslo_concurrency.processutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk /var/lib/nova/instances/snapshots/tmpvzbnwkb_/77bd4cc7afce4ba88a5cd4197d7aa4a5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.526 186962 DEBUG nova.compute.manager [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.527 186962 DEBUG oslo_concurrency.lockutils [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.527 186962 DEBUG oslo_concurrency.lockutils [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.527 186962 DEBUG oslo_concurrency.lockutils [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.528 186962 DEBUG nova.compute.manager [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:50:22 np0005539505 nova_compute[186958]: 2025-11-29 07:50:22.528 186962 WARNING nova.compute.manager [req-9bdd24ac-e6db-49d1-b067-906a536e2860 req-b3311896-d4dc-4782-ae26-484950046a89 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received unexpected event network-vif-unplugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with vm_state active and task_state shelving_image_pending_upload.#033[00m
Nov 29 02:50:22 np0005539505 podman[251654]: 2025-11-29 07:50:22.824913701 +0000 UTC m=+1.796747484 container cleanup 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 02:50:22 np0005539505 systemd[1]: libpod-conmon-96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639.scope: Deactivated successfully.
Nov 29 02:50:23 np0005539505 nova_compute[186958]: 2025-11-29 07:50:23.318 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.437 186962 DEBUG nova.compute.manager [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.438 186962 DEBUG oslo_concurrency.lockutils [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "b97f1300-6668-4955-a425-98d44189860d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.438 186962 DEBUG oslo_concurrency.lockutils [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.439 186962 DEBUG oslo_concurrency.lockutils [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.439 186962 DEBUG nova.compute.manager [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] No waiting events found dispatching network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:50:24 np0005539505 nova_compute[186958]: 2025-11-29 07:50:24.440 186962 WARNING nova.compute.manager [req-27edb84f-d961-4305-abdd-314410e60132 req-88a01974-d059-48e2-8854-4f9d31654007 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received unexpected event network-vif-plugged-9ff644c0-307e-470e-add6-ceb7d6a15833 for instance with vm_state active and task_state shelving_image_pending_upload.#033[00m
Nov 29 02:50:25 np0005539505 podman[251713]: 2025-11-29 07:50:25.258707678 +0000 UTC m=+2.407022740 container remove 96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.268 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0dfb5ab4-7235-4535-9a93-ac08dac9e602]: (4, ('Sat Nov 29 07:50:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f (96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639)\n96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639\nSat Nov 29 07:50:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f (96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639)\n96a5d8124b56b99eea4e15108470fda6e7666f43fd8bd54b2a4a5cd16b6c9639\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.271 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6c5857-3e82-449f-bb0e-a3bcf2d9c6d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.273 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7cbfb39-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:25 np0005539505 nova_compute[186958]: 2025-11-29 07:50:25.276 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:25 np0005539505 kernel: tapd7cbfb39-b0: left promiscuous mode
Nov 29 02:50:25 np0005539505 nova_compute[186958]: 2025-11-29 07:50:25.280 186962 DEBUG oslo_concurrency.processutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d/disk /var/lib/nova/instances/snapshots/tmpvzbnwkb_/77bd4cc7afce4ba88a5cd4197d7aa4a5" returned: 0 in 2.779s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:50:25 np0005539505 nova_compute[186958]: 2025-11-29 07:50:25.282 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Snapshot extracted, beginning image upload#033[00m
Nov 29 02:50:25 np0005539505 nova_compute[186958]: 2025-11-29 07:50:25.306 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.309 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a154ae-6a94-4850-9a09-3ad90a94d53b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.325 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e9926f1a-df78-4dfb-9982-5e4e766901a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.327 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bd59fe99-95cf-4685-8778-a67fd5f08904]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.350 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f24e8a-96ab-4003-9864-9b8bb807dfb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 804472, 'reachable_time': 23585, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251730, 'error': None, 'target': 'ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 systemd[1]: run-netns-ovnmeta\x2dd7cbfb39\x2db4f8\x2d4082\x2dbe26\x2de925bf6de50f.mount: Deactivated successfully.
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.355 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7cbfb39-b4f8-4082-be26-e925bf6de50f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:50:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:25.355 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c36da143-3a19-45b5-bae4-0d1898aa832d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:50:25 np0005539505 nova_compute[186958]: 2025-11-29 07:50:25.821 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:27.535 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:27.536 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:27.537 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:27 np0005539505 nova_compute[186958]: 2025-11-29 07:50:27.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.066 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Snapshot image upload complete#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.067 186962 DEBUG nova.compute.manager [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.157 186962 INFO nova.compute.manager [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Shelve offloading#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.178 186962 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance destroyed successfully.#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.179 186962 DEBUG nova.compute.manager [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.187 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.188 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.188 186962 DEBUG nova.network.neutron [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.211 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:28 np0005539505 nova_compute[186958]: 2025-11-29 07:50:28.319 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:29.030 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:50:29 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:29.031 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:50:29 np0005539505 nova_compute[186958]: 2025-11-29 07:50:29.078 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:30 np0005539505 nova_compute[186958]: 2025-11-29 07:50:30.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:31 np0005539505 nova_compute[186958]: 2025-11-29 07:50:31.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:31 np0005539505 nova_compute[186958]: 2025-11-29 07:50:31.927 186962 DEBUG nova.network.neutron [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:32 np0005539505 nova_compute[186958]: 2025-11-29 07:50:32.074 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:32 np0005539505 podman[251738]: 2025-11-29 07:50:32.779361905 +0000 UTC m=+0.089705433 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:50:32 np0005539505 podman[251737]: 2025-11-29 07:50:32.779748876 +0000 UTC m=+0.103054766 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.316 186962 INFO nova.virt.libvirt.driver [-] [instance: b97f1300-6668-4955-a425-98d44189860d] Instance destroyed successfully.#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.317 186962 DEBUG nova.objects.instance [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lazy-loading 'resources' on Instance uuid b97f1300-6668-4955-a425-98d44189860d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.322 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.335 186962 DEBUG nova.virt.libvirt.vif [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:49:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2098184144',display_name='tempest-TestShelveInstance-server-2098184144',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2098184144',id=175,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOibuoQlHAfiaCYj4CIpZ6qJyfUZB3vV71n9+dsfX5nOIOEpheW32pCW+5Jb9gHkOLzyMwIqxCRE794nWAzInQpsnJoEx4IsVlc2/LJryddECLsGRYjobhhHnV47L4pCvg==',key_name='tempest-TestShelveInstance-1313079058',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:49:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='10520ccc4be44f138c8dd72b1d5edabe',ramdisk_id='',reservation_id='r-48eolvdi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1546337875',owner_user_name='tempest-TestShelveInstance-1546337875-project-member',shelved_at='2025-11-29T07:50:28.067577',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='46c68a90-c681-4942-bd16-74b36275e924'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:50:25Z,user_data=None,user_id='607d794b09b34b829673198ba073234c',uuid=b97f1300-6668-4955-a425-98d44189860d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.336 186962 DEBUG nova.network.os_vif_util [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converting VIF {"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": "br-int", "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9ff644c0-30", "ovs_interfaceid": "9ff644c0-307e-470e-add6-ceb7d6a15833", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.337 186962 DEBUG nova.network.os_vif_util [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.338 186962 DEBUG os_vif [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.340 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.340 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9ff644c0-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.344 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.348 186962 INFO os_vif [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e5:2e:ef,bridge_name='br-int',has_traffic_filtering=True,id=9ff644c0-307e-470e-add6-ceb7d6a15833,network=Network(d7cbfb39-b4f8-4082-be26-e925bf6de50f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9ff644c0-30')#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.348 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Deleting instance files /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d_del#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.356 186962 INFO nova.virt.libvirt.driver [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Deletion of /var/lib/nova/instances/b97f1300-6668-4955-a425-98d44189860d_del complete#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.435 186962 DEBUG nova.compute.manager [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Received event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.435 186962 DEBUG nova.compute.manager [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing instance network info cache due to event network-changed-9ff644c0-307e-470e-add6-ceb7d6a15833. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.435 186962 DEBUG oslo_concurrency.lockutils [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.435 186962 DEBUG oslo_concurrency.lockutils [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.436 186962 DEBUG nova.network.neutron [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Refreshing network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.575 186962 INFO nova.scheduler.client.report [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Deleted allocations for instance b97f1300-6668-4955-a425-98d44189860d#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.642 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.643 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.702 186962 DEBUG nova.compute.provider_tree [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.721 186962 DEBUG nova.scheduler.client.report [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:50:33 np0005539505 nova_compute[186958]: 2025-11-29 07:50:33.750 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:34 np0005539505 nova_compute[186958]: 2025-11-29 07:50:34.227 186962 DEBUG oslo_concurrency.lockutils [None req-37731d26-c379-4968-b228-551f9cee02fa 607d794b09b34b829673198ba073234c 10520ccc4be44f138c8dd72b1d5edabe - - default default] Lock "b97f1300-6668-4955-a425-98d44189860d" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 15.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:35 np0005539505 nova_compute[186958]: 2025-11-29 07:50:35.486 186962 DEBUG nova.network.neutron [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updated VIF entry in instance network info cache for port 9ff644c0-307e-470e-add6-ceb7d6a15833. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:50:35 np0005539505 nova_compute[186958]: 2025-11-29 07:50:35.487 186962 DEBUG nova.network.neutron [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: b97f1300-6668-4955-a425-98d44189860d] Updating instance_info_cache with network_info: [{"id": "9ff644c0-307e-470e-add6-ceb7d6a15833", "address": "fa:16:3e:e5:2e:ef", "network": {"id": "d7cbfb39-b4f8-4082-be26-e925bf6de50f", "bridge": null, "label": "tempest-TestShelveInstance-362870585-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.231", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10520ccc4be44f138c8dd72b1d5edabe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap9ff644c0-30", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:50:35 np0005539505 nova_compute[186958]: 2025-11-29 07:50:35.508 186962 DEBUG oslo_concurrency.lockutils [req-d06e0387-5c40-4a85-8759-db2c32126d76 req-b990e000-3feb-452c-92c6-123bb1010165 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-b97f1300-6668-4955-a425-98d44189860d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:50:36 np0005539505 nova_compute[186958]: 2025-11-29 07:50:36.180 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402621.1789973, b97f1300-6668-4955-a425-98d44189860d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:50:36 np0005539505 nova_compute[186958]: 2025-11-29 07:50:36.181 186962 INFO nova.compute.manager [-] [instance: b97f1300-6668-4955-a425-98d44189860d] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:50:36 np0005539505 nova_compute[186958]: 2025-11-29 07:50:36.198 186962 DEBUG nova.compute.manager [None req-e6a40440-bbd3-4e95-acb4-310733d7c150 - - - - - -] [instance: b97f1300-6668-4955-a425-98d44189860d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:50:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:50:37.032 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:50:37 np0005539505 podman[251782]: 2025-11-29 07:50:37.73103659 +0000 UTC m=+0.059164678 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:50:38 np0005539505 nova_compute[186958]: 2025-11-29 07:50:38.323 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:38 np0005539505 nova_compute[186958]: 2025-11-29 07:50:38.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:38 np0005539505 nova_compute[186958]: 2025-11-29 07:50:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:38 np0005539505 nova_compute[186958]: 2025-11-29 07:50:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:38 np0005539505 nova_compute[186958]: 2025-11-29 07:50:38.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:50:39 np0005539505 nova_compute[186958]: 2025-11-29 07:50:39.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:43 np0005539505 nova_compute[186958]: 2025-11-29 07:50:43.325 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:43 np0005539505 nova_compute[186958]: 2025-11-29 07:50:43.344 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:44 np0005539505 podman[251803]: 2025-11-29 07:50:44.731966686 +0000 UTC m=+0.056232324 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:50:44 np0005539505 podman[251804]: 2025-11-29 07:50:44.75267725 +0000 UTC m=+0.075614230 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.383 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.411 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.412 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.412 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.559 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.561 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5703MB free_disk=73.07107162475586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.561 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:50:45 np0005539505 nova_compute[186958]: 2025-11-29 07:50:45.561 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.109 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.110 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:50:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.312 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.312 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.326 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.346 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.347 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:48 np0005539505 nova_compute[186958]: 2025-11-29 07:50:48.509 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:50:49 np0005539505 nova_compute[186958]: 2025-11-29 07:50:49.414 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:50:49 np0005539505 nova_compute[186958]: 2025-11-29 07:50:49.415 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.853s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:50:49 np0005539505 podman[251853]: 2025-11-29 07:50:49.741111988 +0000 UTC m=+0.064866981 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 29 02:50:49 np0005539505 podman[251854]: 2025-11-29 07:50:49.760164464 +0000 UTC m=+0.078059139 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:50:50 np0005539505 nova_compute[186958]: 2025-11-29 07:50:50.414 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:50 np0005539505 nova_compute[186958]: 2025-11-29 07:50:50.415 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:50:50 np0005539505 nova_compute[186958]: 2025-11-29 07:50:50.415 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:50:50 np0005539505 nova_compute[186958]: 2025-11-29 07:50:50.804 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:50:50 np0005539505 nova_compute[186958]: 2025-11-29 07:50:50.805 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:53 np0005539505 nova_compute[186958]: 2025-11-29 07:50:53.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539505 nova_compute[186958]: 2025-11-29 07:50:53.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:53 np0005539505 nova_compute[186958]: 2025-11-29 07:50:53.764 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:54 np0005539505 nova_compute[186958]: 2025-11-29 07:50:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:50:58 np0005539505 nova_compute[186958]: 2025-11-29 07:50:58.330 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:50:58 np0005539505 nova_compute[186958]: 2025-11-29 07:50:58.350 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:03 np0005539505 nova_compute[186958]: 2025-11-29 07:51:03.332 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:03 np0005539505 nova_compute[186958]: 2025-11-29 07:51:03.352 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:03 np0005539505 podman[251893]: 2025-11-29 07:51:03.730167665 +0000 UTC m=+0.053593767 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:51:03 np0005539505 podman[251892]: 2025-11-29 07:51:03.784478053 +0000 UTC m=+0.101748539 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:51:08 np0005539505 nova_compute[186958]: 2025-11-29 07:51:08.334 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:08 np0005539505 nova_compute[186958]: 2025-11-29 07:51:08.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:08 np0005539505 podman[251938]: 2025-11-29 07:51:08.724967527 +0000 UTC m=+0.055722859 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 29 02:51:13 np0005539505 nova_compute[186958]: 2025-11-29 07:51:13.335 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:13 np0005539505 nova_compute[186958]: 2025-11-29 07:51:13.354 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:15 np0005539505 podman[251957]: 2025-11-29 07:51:15.729081685 +0000 UTC m=+0.057961373 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:51:15 np0005539505 podman[251958]: 2025-11-29 07:51:15.804092885 +0000 UTC m=+0.114583226 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:51:18 np0005539505 nova_compute[186958]: 2025-11-29 07:51:18.337 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:18 np0005539505 nova_compute[186958]: 2025-11-29 07:51:18.355 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:20 np0005539505 podman[252009]: 2025-11-29 07:51:20.718061489 +0000 UTC m=+0.054590646 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:51:20 np0005539505 podman[252010]: 2025-11-29 07:51:20.745966689 +0000 UTC m=+0.074354343 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:51:23 np0005539505 nova_compute[186958]: 2025-11-29 07:51:23.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:23 np0005539505 nova_compute[186958]: 2025-11-29 07:51:23.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:27.536 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:27.536 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:27.536 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:28 np0005539505 nova_compute[186958]: 2025-11-29 07:51:28.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:28 np0005539505 nova_compute[186958]: 2025-11-29 07:51:28.358 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:33 np0005539505 nova_compute[186958]: 2025-11-29 07:51:33.344 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:33 np0005539505 nova_compute[186958]: 2025-11-29 07:51:33.359 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:34 np0005539505 podman[252048]: 2025-11-29 07:51:34.726405449 +0000 UTC m=+0.049228782 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:51:34 np0005539505 podman[252047]: 2025-11-29 07:51:34.732985658 +0000 UTC m=+0.060114745 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, vcs-type=git, maintainer=Red Hat, Inc.)
Nov 29 02:51:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:34.806 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:51:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:34.806 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:51:34 np0005539505 nova_compute[186958]: 2025-11-29 07:51:34.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:38 np0005539505 nova_compute[186958]: 2025-11-29 07:51:38.346 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:38 np0005539505 nova_compute[186958]: 2025-11-29 07:51:38.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:38 np0005539505 nova_compute[186958]: 2025-11-29 07:51:38.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:39 np0005539505 podman[252095]: 2025-11-29 07:51:39.713436425 +0000 UTC m=+0.047793461 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 29 02:51:40 np0005539505 nova_compute[186958]: 2025-11-29 07:51:40.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:40 np0005539505 nova_compute[186958]: 2025-11-29 07:51:40.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:40 np0005539505 nova_compute[186958]: 2025-11-29 07:51:40.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:51:41 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:51:41.808 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:51:43 np0005539505 nova_compute[186958]: 2025-11-29 07:51:43.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:43 np0005539505 nova_compute[186958]: 2025-11-29 07:51:43.361 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:45 np0005539505 nova_compute[186958]: 2025-11-29 07:51:45.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.183 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.184 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.184 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.184 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.368 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.369 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5734MB free_disk=73.07107162475586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.370 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:51:46 np0005539505 nova_compute[186958]: 2025-11-29 07:51:46.370 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:51:46 np0005539505 podman[252114]: 2025-11-29 07:51:46.723953948 +0000 UTC m=+0.053981199 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:51:46 np0005539505 podman[252115]: 2025-11-29 07:51:46.755008638 +0000 UTC m=+0.085326297 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.836 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.837 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.852 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.877 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.878 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.919 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.942 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:51:47 np0005539505 nova_compute[186958]: 2025-11-29 07:51:47.961 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:51:48 np0005539505 nova_compute[186958]: 2025-11-29 07:51:48.123 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:51:48 np0005539505 nova_compute[186958]: 2025-11-29 07:51:48.125 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:51:48 np0005539505 nova_compute[186958]: 2025-11-29 07:51:48.125 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:51:48 np0005539505 nova_compute[186958]: 2025-11-29 07:51:48.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:48 np0005539505 nova_compute[186958]: 2025-11-29 07:51:48.363 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:49 np0005539505 nova_compute[186958]: 2025-11-29 07:51:49.126 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:49 np0005539505 nova_compute[186958]: 2025-11-29 07:51:49.127 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:51:49 np0005539505 nova_compute[186958]: 2025-11-29 07:51:49.127 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:51:49 np0005539505 nova_compute[186958]: 2025-11-29 07:51:49.143 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:51:49 np0005539505 nova_compute[186958]: 2025-11-29 07:51:49.143 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:50 np0005539505 nova_compute[186958]: 2025-11-29 07:51:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:51 np0005539505 podman[252166]: 2025-11-29 07:51:51.717828914 +0000 UTC m=+0.050879109 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:51:51 np0005539505 podman[252165]: 2025-11-29 07:51:51.717980119 +0000 UTC m=+0.052466076 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:51:53 np0005539505 nova_compute[186958]: 2025-11-29 07:51:53.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:53 np0005539505 nova_compute[186958]: 2025-11-29 07:51:53.365 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:53 np0005539505 nova_compute[186958]: 2025-11-29 07:51:53.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:56 np0005539505 nova_compute[186958]: 2025-11-29 07:51:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:51:58 np0005539505 nova_compute[186958]: 2025-11-29 07:51:58.356 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:51:58 np0005539505 nova_compute[186958]: 2025-11-29 07:51:58.366 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:03 np0005539505 nova_compute[186958]: 2025-11-29 07:52:03.357 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:03 np0005539505 nova_compute[186958]: 2025-11-29 07:52:03.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:03Z|00814|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 02:52:05 np0005539505 podman[252204]: 2025-11-29 07:52:05.715921569 +0000 UTC m=+0.052216428 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Nov 29 02:52:05 np0005539505 podman[252205]: 2025-11-29 07:52:05.737160808 +0000 UTC m=+0.069473663 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:52:08 np0005539505 nova_compute[186958]: 2025-11-29 07:52:08.359 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:08 np0005539505 nova_compute[186958]: 2025-11-29 07:52:08.367 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:10 np0005539505 podman[252250]: 2025-11-29 07:52:10.712509762 +0000 UTC m=+0.048803751 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:52:13 np0005539505 nova_compute[186958]: 2025-11-29 07:52:13.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:13 np0005539505 nova_compute[186958]: 2025-11-29 07:52:13.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:17 np0005539505 podman[252269]: 2025-11-29 07:52:17.739749283 +0000 UTC m=+0.061066902 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:52:17 np0005539505 podman[252270]: 2025-11-29 07:52:17.762841905 +0000 UTC m=+0.091073752 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.370 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.372 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:18 np0005539505 nova_compute[186958]: 2025-11-29 07:52:18.375 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.825 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.826 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.843 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.952 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.952 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.958 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:52:21 np0005539505 nova_compute[186958]: 2025-11-29 07:52:21.959 186962 INFO nova.compute.claims [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.095 186962 DEBUG nova.compute.provider_tree [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.114 186962 DEBUG nova.scheduler.client.report [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.135 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.136 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.189 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.189 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.204 186962 INFO nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.236 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.347 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.348 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.349 186962 INFO nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Creating image(s)#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.349 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.350 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.350 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.363 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.454 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.455 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.456 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.467 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.526 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.527 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:22 np0005539505 podman[252330]: 2025-11-29 07:52:22.714058415 +0000 UTC m=+0.048348517 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:52:22 np0005539505 podman[252331]: 2025-11-29 07:52:22.747164435 +0000 UTC m=+0.077216725 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 29 02:52:22 np0005539505 nova_compute[186958]: 2025-11-29 07:52:22.870 186962 DEBUG nova.policy [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.001 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk 1073741824" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.002 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.003 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.090 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.091 186962 DEBUG nova.virt.disk.api [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Checking if we can resize image /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.091 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.144 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.146 186962 DEBUG nova.virt.disk.api [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Cannot resize image /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.147 186962 DEBUG nova.objects.instance [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lazy-loading 'migration_context' on Instance uuid 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.166 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.167 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Ensure instance console log exists: /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.167 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.167 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.168 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.376 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:52:23 np0005539505 nova_compute[186958]: 2025-11-29 07:52:23.981 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Successfully created port: 8458662e-9962-45e2-b5a5-fc24dafd350c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.082 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Successfully updated port: 8458662e-9962-45e2-b5a5-fc24dafd350c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.099 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.099 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquired lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.099 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.190 186962 DEBUG nova.compute.manager [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.190 186962 DEBUG nova.compute.manager [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing instance network info cache due to event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.190 186962 DEBUG oslo_concurrency.lockutils [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:52:25 np0005539505 nova_compute[186958]: 2025-11-29 07:52:25.249 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.259 186962 DEBUG nova.network.neutron [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.283 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Releasing lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.283 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Instance network_info: |[{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.284 186962 DEBUG oslo_concurrency.lockutils [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.284 186962 DEBUG nova.network.neutron [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.288 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Start _get_guest_xml network_info=[{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.295 186962 WARNING nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.300 186962 DEBUG nova.virt.libvirt.host [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.301 186962 DEBUG nova.virt.libvirt.host [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.306 186962 DEBUG nova.virt.libvirt.host [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.307 186962 DEBUG nova.virt.libvirt.host [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.308 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.309 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.309 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.310 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.310 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.310 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.310 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.311 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.311 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.311 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.311 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.312 186962 DEBUG nova.virt.hardware [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.317 186962 DEBUG nova.virt.libvirt.vif [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2040274870-ac',id=177,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWzBFc8/3Sb6BS1/ZrfkfNaj2jrRaAZoEVao7d76mDDV5hsT8XfpQrn3nMB5pU7+moluz7SWJrm47HKfej+hy+pagM6gZl2OaZEh+D1RFl9WdunPf0X78trv+mSaw/Pwg==',key_name='tempest-TestSecurityGroupsBasicOps-1012627486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10ccdb85b00c46b998605e9c401e8471',ramdisk_id='',reservation_id='r-zqldyzm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2040274870',owner_user_name='tempest-TestSecurityGroupsBasicOps-2040274870-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:52:22Z,user_data=None,user_id='65b6a37387a54d5da4a426cd229516da',uuid=7bd4d93e-30c7-452c-ac48-bbd6cf739b04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.317 186962 DEBUG nova.network.os_vif_util [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converting VIF {"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.318 186962 DEBUG nova.network.os_vif_util [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.319 186962 DEBUG nova.objects.instance [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.332 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <uuid>7bd4d93e-30c7-452c-ac48-bbd6cf739b04</uuid>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <name>instance-000000b1</name>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258</nova:name>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:52:26</nova:creationTime>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:user uuid="65b6a37387a54d5da4a426cd229516da">tempest-TestSecurityGroupsBasicOps-2040274870-project-member</nova:user>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:project uuid="10ccdb85b00c46b998605e9c401e8471">tempest-TestSecurityGroupsBasicOps-2040274870</nova:project>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        <nova:port uuid="8458662e-9962-45e2-b5a5-fc24dafd350c">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="serial">7bd4d93e-30c7-452c-ac48-bbd6cf739b04</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="uuid">7bd4d93e-30c7-452c-ac48-bbd6cf739b04</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.config"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:9a:17:b1"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <target dev="tap8458662e-99"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/console.log" append="off"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:52:26 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:52:26 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:52:26 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:52:26 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.334 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Preparing to wait for external event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.334 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.334 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.335 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.335 186962 DEBUG nova.virt.libvirt.vif [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2040274870-ac',id=177,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWzBFc8/3Sb6BS1/ZrfkfNaj2jrRaAZoEVao7d76mDDV5hsT8XfpQrn3nMB5pU7+moluz7SWJrm47HKfej+hy+pagM6gZl2OaZEh+D1RFl9WdunPf0X78trv+mSaw/Pwg==',key_name='tempest-TestSecurityGroupsBasicOps-1012627486',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='10ccdb85b00c46b998605e9c401e8471',ramdisk_id='',reservation_id='r-zqldyzm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2040274870',owner_user_name='tempest-TestSecurityGroupsBasicOps-2040274870-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:52:22Z,user_data=None,user_id='65b6a37387a54d5da4a426cd229516da',uuid=7bd4d93e-30c7-452c-ac48-bbd6cf739b04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.336 186962 DEBUG nova.network.os_vif_util [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converting VIF {"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.337 186962 DEBUG nova.network.os_vif_util [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.337 186962 DEBUG os_vif [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.338 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.338 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.339 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.342 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.342 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8458662e-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.342 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8458662e-99, col_values=(('external_ids', {'iface-id': '8458662e-9962-45e2-b5a5-fc24dafd350c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:17:b1', 'vm-uuid': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.344 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:26 np0005539505 NetworkManager[55134]: <info>  [1764402746.3457] manager: (tap8458662e-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.347 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.351 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:26 np0005539505 nova_compute[186958]: 2025-11-29 07:52:26.352 186962 INFO os_vif [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99')#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.039 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.040 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.040 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] No VIF found with MAC fa:16:3e:9a:17:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.041 186962 INFO nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Using config drive#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.424 186962 DEBUG nova.network.neutron [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updated VIF entry in instance network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.424 186962 DEBUG nova.network.neutron [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.441 186962 INFO nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Creating config drive at /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.config#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.446 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0skffx4q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.465 186962 DEBUG oslo_concurrency.lockutils [req-ba9c83de-ae80-4037-97c6-5d82f977bd1a req-0af916f8-8b72-463e-806d-d5cb63951a78 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.537 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.538 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.538 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.571 186962 DEBUG oslo_concurrency.processutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0skffx4q" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:27 np0005539505 kernel: tap8458662e-99: entered promiscuous mode
Nov 29 02:52:27 np0005539505 NetworkManager[55134]: <info>  [1764402747.6525] manager: (tap8458662e-99): new Tun device (/org/freedesktop/NetworkManager/Devices/402)
Nov 29 02:52:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:27Z|00815|binding|INFO|Claiming lport 8458662e-9962-45e2-b5a5-fc24dafd350c for this chassis.
Nov 29 02:52:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:27Z|00816|binding|INFO|8458662e-9962-45e2-b5a5-fc24dafd350c: Claiming fa:16:3e:9a:17:b1 10.100.0.4
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.654 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.659 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:27 np0005539505 systemd-udevd[252392]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:52:27 np0005539505 NetworkManager[55134]: <info>  [1764402747.7035] device (tap8458662e-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:52:27 np0005539505 NetworkManager[55134]: <info>  [1764402747.7063] device (tap8458662e-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:52:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:27Z|00817|binding|INFO|Setting lport 8458662e-9962-45e2-b5a5-fc24dafd350c ovn-installed in OVS
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.737 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:27 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:27Z|00818|binding|INFO|Setting lport 8458662e-9962-45e2-b5a5-fc24dafd350c up in Southbound
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.743 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:17:b1 10.100.0.4'], port_security=['fa:16:3e:9a:17:b1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10ccdb85b00c46b998605e9c401e8471', 'neutron:revision_number': '2', 'neutron:security_group_ids': '182ed522-20b6-46f8-b127-0094ebdc5f1e eb12b87d-71a1-463c-9452-9a8a13cd5539', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ee39ebc-2ea6-4a36-95ac-0df5bb2bee30, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8458662e-9962-45e2-b5a5-fc24dafd350c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.744 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8458662e-9962-45e2-b5a5-fc24dafd350c in datapath 083a6a6e-74f7-4aca-8db7-8efd2372a971 bound to our chassis#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.746 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 083a6a6e-74f7-4aca-8db7-8efd2372a971#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.868 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.872 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9819e2b1-7aa1-4b9a-9520-04091134053f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.873 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap083a6a6e-71 in ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.876 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap083a6a6e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.877 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e43dbd02-6306-43d1-976b-3192745ea063]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.878 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[15bb7a66-5b81-448c-a93a-dc6b81e7ce40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.893 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4ecb92a0-1d6e-4767-912c-09a79eb5ae13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 systemd-machined[153285]: New machine qemu-85-instance-000000b1.
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.913 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3b8d0309-4d38-43ad-836f-f65f1c0bc9f2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 systemd[1]: Started Virtual Machine qemu-85-instance-000000b1.
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.954 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dcea7629-5fdd-4e9d-8789-fc98df07238b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:27.962 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a2abee19-c35f-4e83-a78c-440bae14e99d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:27 np0005539505 NetworkManager[55134]: <info>  [1764402747.9633] manager: (tap083a6a6e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/403)
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.980 186962 DEBUG nova.compute.manager [req-88feac19-1b99-47a4-bb7a-b0edef7de95b req-fadc5bfe-0df7-4442-be0d-d3a42f41b256 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.981 186962 DEBUG oslo_concurrency.lockutils [req-88feac19-1b99-47a4-bb7a-b0edef7de95b req-fadc5bfe-0df7-4442-be0d-d3a42f41b256 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.981 186962 DEBUG oslo_concurrency.lockutils [req-88feac19-1b99-47a4-bb7a-b0edef7de95b req-fadc5bfe-0df7-4442-be0d-d3a42f41b256 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.982 186962 DEBUG oslo_concurrency.lockutils [req-88feac19-1b99-47a4-bb7a-b0edef7de95b req-fadc5bfe-0df7-4442-be0d-d3a42f41b256 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:27 np0005539505 nova_compute[186958]: 2025-11-29 07:52:27.982 186962 DEBUG nova.compute.manager [req-88feac19-1b99-47a4-bb7a-b0edef7de95b req-fadc5bfe-0df7-4442-be0d-d3a42f41b256 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Processing event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.009 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[64f7bfa5-1161-4816-b6dc-c636c9eb862c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.014 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[05b161cd-fd65-4347-9477-1415bf57c47e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 NetworkManager[55134]: <info>  [1764402748.0432] device (tap083a6a6e-70): carrier: link connected
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.048 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[32beea64-98eb-4fc3-8a19-00f37e873cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.072 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7487daf8-a9d0-4a5a-8a1f-b88258f034ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap083a6a6e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:bf:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 819562, 'reachable_time': 22021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252429, 'error': None, 'target': 'ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.089 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ec94dd1c-d347-4e9f-a0af-219968913899]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:bf2e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 819562, 'tstamp': 819562}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252430, 'error': None, 'target': 'ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.106 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ccda9993-0a8d-4a56-a36d-626aef78f2c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap083a6a6e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:bf:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 819562, 'reachable_time': 22021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252431, 'error': None, 'target': 'ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.138 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0702af-0669-4158-b772-17e249e57715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.195 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4fee90cf-59e6-4e06-a300-619dd0c5bf89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.197 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap083a6a6e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.197 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.197 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap083a6a6e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.199 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539505 NetworkManager[55134]: <info>  [1764402748.2002] manager: (tap083a6a6e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/404)
Nov 29 02:52:28 np0005539505 kernel: tap083a6a6e-70: entered promiscuous mode
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.202 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.203 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap083a6a6e-70, col_values=(('external_ids', {'iface-id': 'a7d70f4b-2d70-4242-bb4d-0e2d1e9247ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.204 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:28Z|00819|binding|INFO|Releasing lport a7d70f4b-2d70-4242-bb4d-0e2d1e9247ef from this chassis (sb_readonly=0)
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.217 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.219 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/083a6a6e-74f7-4aca-8db7-8efd2372a971.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/083a6a6e-74f7-4aca-8db7-8efd2372a971.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.220 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9473726-8c65-43f5-9a4c-d33233ef289b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.221 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-083a6a6e-74f7-4aca-8db7-8efd2372a971
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/083a6a6e-74f7-4aca-8db7-8efd2372a971.pid.haproxy
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 083a6a6e-74f7-4aca-8db7-8efd2372a971
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:52:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:28.222 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'env', 'PROCESS_TAG=haproxy-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/083a6a6e-74f7-4aca-8db7-8efd2372a971.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.237 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.238 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402748.2362747, 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.239 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] VM Started (Lifecycle Event)#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.243 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.248 186962 INFO nova.virt.libvirt.driver [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Instance spawned successfully.#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.248 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.266 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.274 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.279 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.280 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.281 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.281 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.282 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.283 186962 DEBUG nova.virt.libvirt.driver [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.307 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.308 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402748.238262, 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.308 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.338 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.345 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402748.2430947, 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.345 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.359 186962 INFO nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Took 6.01 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.360 186962 DEBUG nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.367 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.370 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.379 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.391 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.449 186962 INFO nova.compute.manager [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Took 6.54 seconds to build instance.#033[00m
Nov 29 02:52:28 np0005539505 nova_compute[186958]: 2025-11-29 07:52:28.471 186962 DEBUG oslo_concurrency.lockutils [None req-7c8f013b-4ca7-4ac7-be03-9a6c05f80c90 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:28 np0005539505 podman[252469]: 2025-11-29 07:52:28.573045113 +0000 UTC m=+0.020331254 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.087 186962 DEBUG nova.compute.manager [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.089 186962 DEBUG oslo_concurrency.lockutils [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.090 186962 DEBUG oslo_concurrency.lockutils [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.090 186962 DEBUG oslo_concurrency.lockutils [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.090 186962 DEBUG nova.compute.manager [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] No waiting events found dispatching network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:52:30 np0005539505 nova_compute[186958]: 2025-11-29 07:52:30.091 186962 WARNING nova.compute.manager [req-7f66079a-f9ca-44d7-a112-a6a2ee327baf req-29f99afb-6957-45a5-b6d2-5ed70c6f215a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received unexpected event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c for instance with vm_state active and task_state None.#033[00m
Nov 29 02:52:31 np0005539505 nova_compute[186958]: 2025-11-29 07:52:31.345 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:32 np0005539505 podman[252469]: 2025-11-29 07:52:32.17810619 +0000 UTC m=+3.625392301 container create a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:52:32 np0005539505 systemd[1]: Started libpod-conmon-a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f.scope.
Nov 29 02:52:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:52:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/544365cb9b1649369cc15a1a47576a2af559ddffc22dbbca82c310970a29794c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:52:33 np0005539505 NetworkManager[55134]: <info>  [1764402753.1832] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.182 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539505 NetworkManager[55134]: <info>  [1764402753.1840] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/406)
Nov 29 02:52:33 np0005539505 podman[252469]: 2025-11-29 07:52:33.211424454 +0000 UTC m=+4.658710565 container init a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:52:33 np0005539505 podman[252469]: 2025-11-29 07:52:33.22276769 +0000 UTC m=+4.670053781 container start a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:52:33 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [NOTICE]   (252489) : New worker (252491) forked
Nov 29 02:52:33 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [NOTICE]   (252489) : Loading success.
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.320 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:33Z|00820|binding|INFO|Releasing lport a7d70f4b-2d70-4242-bb4d-0e2d1e9247ef from this chassis (sb_readonly=0)
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.339 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.381 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.451 186962 DEBUG nova.compute.manager [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.452 186962 DEBUG nova.compute.manager [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing instance network info cache due to event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.452 186962 DEBUG oslo_concurrency.lockutils [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.453 186962 DEBUG oslo_concurrency.lockutils [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:52:33 np0005539505 nova_compute[186958]: 2025-11-29 07:52:33.453 186962 DEBUG nova.network.neutron [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:52:35 np0005539505 nova_compute[186958]: 2025-11-29 07:52:35.059 186962 DEBUG nova.network.neutron [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updated VIF entry in instance network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:52:35 np0005539505 nova_compute[186958]: 2025-11-29 07:52:35.059 186962 DEBUG nova.network.neutron [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:35 np0005539505 nova_compute[186958]: 2025-11-29 07:52:35.077 186962 DEBUG oslo_concurrency.lockutils [req-6d4bef25-924e-4a89-add1-eb2fe6586764 req-59204b90-49dc-4554-af28-15cec2501e50 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:52:35 np0005539505 nova_compute[186958]: 2025-11-29 07:52:35.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:36 np0005539505 nova_compute[186958]: 2025-11-29 07:52:36.349 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:36 np0005539505 podman[252502]: 2025-11-29 07:52:36.733243951 +0000 UTC m=+0.060195148 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:52:36 np0005539505 podman[252501]: 2025-11-29 07:52:36.766179684 +0000 UTC m=+0.093778109 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., 
com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 29 02:52:38 np0005539505 nova_compute[186958]: 2025-11-29 07:52:38.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:39 np0005539505 nova_compute[186958]: 2025-11-29 07:52:39.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:40 np0005539505 nova_compute[186958]: 2025-11-29 07:52:40.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:40Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:17:b1 10.100.0.4
Nov 29 02:52:40 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:40Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:17:b1 10.100.0.4
Nov 29 02:52:41 np0005539505 nova_compute[186958]: 2025-11-29 07:52:41.353 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:41 np0005539505 podman[252565]: 2025-11-29 07:52:41.721015225 +0000 UTC m=+0.053311242 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:52:42 np0005539505 nova_compute[186958]: 2025-11-29 07:52:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:42 np0005539505 nova_compute[186958]: 2025-11-29 07:52:42.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:52:43 np0005539505 nova_compute[186958]: 2025-11-29 07:52:43.431 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:46 np0005539505 nova_compute[186958]: 2025-11-29 07:52:46.355 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.470 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.471 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.471 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.471 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.675 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.733 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.735 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.791 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.938 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.939 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5504MB free_disk=73.04237747192383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.940 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:47 np0005539505 nova_compute[186958]: 2025-11-29 07:52:47.940 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.112 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b1', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '10ccdb85b00c46b998605e9c401e8471', 'user_id': '65b6a37387a54d5da4a426cd229516da', 'hostId': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.113 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>]
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.116 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 / tap8458662e-99 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.116 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5cfde072-b1d3-4be2-a377-63e99365c8e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.114447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65d92fd4-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'c4339a1b2c69e8e6d1cb78d887826c2046e42d72b32356f0b540302536ad77ec'}]}, 'timestamp': '2025-11-29 07:52:48.117613', '_unique_id': '545d84aba1364736b389aa83147f6ec7'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.119 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.130 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.130 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2ca4f5e-4497-40d8-8533-a4295970c8b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.120599', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65db35fe-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': 'e10b0f7403cbca2a78f0a906d70140e2d747c2f6c6da000e56c7bf23e91a053f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.120599', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65db4512-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': '9e933e7872afd44689a59b4e3cc69ec1d919848e75cb38f8520aef0bbe017873'}]}, 'timestamp': '2025-11-29 07:52:48.131126', '_unique_id': '1df64b9401bd43b2be52668f5f38dd24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.132 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.147 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/memory.usage volume: 40.4140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df5d7f09-47e6-423f-bbdf-a5cf2e9f1e27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4140625, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'timestamp': '2025-11-29T07:52:48.133408', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '65dde218-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.788373774, 'message_signature': '0e6892feea646372bf2aaeab2040aff3fe960980f11689f4de27eacdf6f8ba94'}]}, 'timestamp': '2025-11-29 07:52:48.148342', '_unique_id': '94e0f921611c4ed7bca897e9009c763e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.149 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.150 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.outgoing.packets volume: 14 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6411d755-df55-4e70-ae36-d9a01b7a3bda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 14, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.150797', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65de519e-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'a4dcc709bf0386d4798c36c3eeab9611ce198c62a2e2a69ed7a935c6e237615c'}]}, 'timestamp': '2025-11-29 07:52:48.151119', '_unique_id': '4f0abbaf441c4b67b52890e1c8371bfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.151 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.152 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0271d7f2-d359-4305-9530-cd98ff123485', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.152737', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65de9cda-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': '547007004b60566c1e02a8424ee676bb14cff0360e641866a90d7e3cc070333c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 
'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.152737', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65dea662-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': '7f3dbe12dd01c8a460440ac9c24aa5fc820048f9319a1bc03c35624c0cc74c6c'}]}, 'timestamp': '2025-11-29 07:52:48.153289', '_unique_id': '5ec10b5e30c94ba8bfa03cf4efa4ace1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.153 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.166 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.167 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.167 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.177 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.178 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '647d0a4f-a2a6-45d6-b57c-5f6728e6fc87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.154822', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e27274-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'ccc4112d9b8bcf2515372fd23a20eb75010642bbf790c37ae59a8c37f3363ea2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': 
'10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.154822', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e282a0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '88fa567425f4d1f8f182855eb26778113142dc82c9185e0ce95b1e74f72fe276'}]}, 'timestamp': '2025-11-29 07:52:48.178569', '_unique_id': 'b0af52b37ce0410d9c50f3563495379f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>]
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.181 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc84d2c4-cddd-45f6-a3e0-969fa9224c02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.181607', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e3063a-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': '0810119dc95d8eb281fead7147bf3e289d18935ba3f570e7bf0c54a68ea7c4a1'}]}, 'timestamp': '2025-11-29 07:52:48.181943', '_unique_id': '5dd08b6634034e9fb54e4c89123ac971'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.183 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee38cfb0-6e83-46ca-ae21-ef90c9fb0b42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.183810', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e35a72-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'd57424fa591fc8f275234bc918428f5c87df8f2c121c73551c06bff89f8c2ab1'}]}, 'timestamp': '2025-11-29 07:52:48.184087', '_unique_id': '59d26572fc3c412c8e3dc1105c562dbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.185 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.bytes volume: 30665216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.185 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72959a1f-cf79-4df0-b65c-951221c64806', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30665216, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.185474', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e39bcc-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '572f9e14d7e3b3601878cbccb333b098412c2c3ad6acd311d41882afecebc70f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.185474', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e3a43c-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '3de62731408db7f71627e48d8ba28f07529bef1ff7847440c1b732dccb03cf02'}]}, 'timestamp': '2025-11-29 07:52:48.185961', '_unique_id': '1ff9a51db30e4ddab773de4705c2bad6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f975b29-fcfd-4b01-80be-d0d263afc46e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.187146', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e3dc18-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'd8618f4080c1636935d1bfc2591d2841421b2e9e8c6f797a2919a397bb03961b'}]}, 'timestamp': '2025-11-29 07:52:48.187416', '_unique_id': 'ac82a434d87d4438878460c01c63c478'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.188 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.latency volume: 218042498 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.188 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.latency volume: 25230093 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e4cba8-a240-4e94-be33-f36cd8678359', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 218042498, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.188674', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e417dc-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '416fc059ff34e449152a2d4e812041d6d5e76f65fa08aff01e7c2eb18a9187c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25230093, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.188674', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e42222-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'bfb3a87cf0eab9773eb68d58e074177ba5cf517d4e592aab837d168ff416d2ad'}]}, 'timestamp': '2025-11-29 07:52:48.189265', '_unique_id': '83d4d099abb840fca8af5195a53e1996'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.190 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/cpu volume: 12120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eebed39-06e6-4662-b3fa-a8fcf94e5bd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12120000000, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'timestamp': '2025-11-29T07:52:48.190550', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '65e4600c-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.788373774, 'message_signature': '4ba11abaa3217210fce074ed6d4c7294645a51a2ff7660e87c26d2280c1347a8'}]}, 'timestamp': '2025-11-29 07:52:48.190774', '_unique_id': '41d0f9226a8446ab943a744abdcdd72b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.191 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7afb634c-8a59-48c2-ae00-3ecb52135119', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.191963', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e497ac-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': '8801d0f44d3e36a5b6a91395d45f8594a15c21b7fae4bca5d2a0ed946824506f'}]}, 'timestamp': '2025-11-29 07:52:48.192206', '_unique_id': 'dbe6275b116e4c5da906e5254c9e8679'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.193 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2da5429a-000c-483a-99eb-61fbd7b839ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.193378', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e4cf4c-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': '6c52f97ae49605ef97bbab5db0f6849bf60d48145c56bdf1d105dc5be4b168b9'}]}, 'timestamp': '2025-11-29 07:52:48.193646', '_unique_id': 'a5203e82674b4af9a20dde4d6bdf0299'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.194 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.195 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>]
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.195 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.requests volume: 1108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.195 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfe69a42-cd58-4d58-8d01-6a5a45b928bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1108, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.195269', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e519de-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'bf5eb6223a313c2f66c8a13249d77e4bcc608b1d639521fb2ee360dfd46da42d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.195269', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e52230-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '2936823654c263181c2e9e57e3994eb1d479bae45223477a50c5bc710bec297f'}]}, 'timestamp': '2025-11-29 07:52:48.195738', '_unique_id': '5791ca7ebb854c75a19fbe698a275cf5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.196 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.requests volume: 301 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '514958df-b384-48a8-adc0-0725f54a931b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 301, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.196837', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e55566-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': '34f99719f813e2b5705c7d40210d32026e29be5b9ba046f1ca86b050343fa885'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.196837', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e55d40-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'f160a2a0c9de0ea6248a5ab642025cfa9a1580248ae0896d9d95263db42c0341'}]}, 'timestamp': '2025-11-29 07:52:48.197284', '_unique_id': '723783a3490c4660b060ab3c39248a10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.198 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.outgoing.bytes volume: 1480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '861feb5f-1fb6-4952-be33-636ee18219b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1480, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.198408', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e59328-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': '4592f3adeb6223969042e8bb0397aa43a43cc6336ea2e75388e1242189afc4f3'}]}, 'timestamp': '2025-11-29 07:52:48.198640', '_unique_id': '8c141821c7394963bc17af59205f5cba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.199 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7950e420-fdb3-4762-8cca-a9a8954e43ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.199784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e5c898-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': '87f176563cba0834deb7376199a0256142ebefb9125324b265c891bc726f1ee4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.199784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e5d0a4-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.761453671, 'message_signature': 'b05b78c4dc88b0751de7310170b9b509a61baa3c13d6107a686023ea636fe7f4'}]}, 'timestamp': '2025-11-29 07:52:48.200210', '_unique_id': '71ee0289d6a74e4eb9e9e0f129f08244'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.201 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.201 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258>]
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.201 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.latency volume: 5152014083 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.202 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a47ba2e-ad2e-43ff-a794-61b763f85e78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5152014083, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-vda', 'timestamp': '2025-11-29T07:52:48.201927', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65e61d70-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'e33a6fc625fa918e0a65e8da0431316d0fe57c0e29aadccc1a22fd4175fc722d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04-sda', 'timestamp': '2025-11-29T07:52:48.201927', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'instance-000000b1', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65e628b0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.795670761, 'message_signature': 'bcddc878c96ac1f04e7a51ae381f5156f62c7b86142b8ff3ede5244347c33187'}]}, 'timestamp': '2025-11-29 07:52:48.202491', '_unique_id': 'b47559f4f0fd4fc88e836fa7fbc61ddd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.203 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '198a4da8-e5f0-4e7b-a7b5-543b2e24f769', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.203740', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e66410-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'b4bb289d6d3092b9fd1950b5959f242df2b22b8e45f547fbea21e489ba526b04'}]}, 'timestamp': '2025-11-29 07:52:48.203990', '_unique_id': 'ce7a931481e840ad8a6931490771fd54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 DEBUG ceilometer.compute.pollsters [-] 7bd4d93e-30c7-452c-ac48-bbd6cf739b04/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cdd2010-6875-429a-93b2-e9d5160c34b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '65b6a37387a54d5da4a426cd229516da', 'user_name': None, 'project_id': '10ccdb85b00c46b998605e9c401e8471', 'project_name': None, 'resource_id': 'instance-000000b1-7bd4d93e-30c7-452c-ac48-bbd6cf739b04-tap8458662e-99', 'timestamp': '2025-11-29T07:52:48.205164', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258', 'name': 'tap8458662e-99', 'instance_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'instance_type': 'm1.nano', 'host': 'b6300176004f40af436a4d5492fc3cb04187bbd94dd408ad6844b6cd', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9a:17:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8458662e-99'}, 'message_id': '65e69dd6-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8215.755280556, 'message_signature': 'f966c81801c5333220b4559c33ee79e44ecbb7b4823500adf62d304b9afd926f'}]}, 'timestamp': '2025-11-29 07:52:48.205471', '_unique_id': '446ea91b844f44dbbf915a8b3925db47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:52:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:52:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.255 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.277 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.310 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.310 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:48 np0005539505 nova_compute[186958]: 2025-11-29 07:52:48.433 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:48 np0005539505 podman[252591]: 2025-11-29 07:52:48.763905782 +0000 UTC m=+0.079358471 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:52:48 np0005539505 podman[252592]: 2025-11-29 07:52:48.774900384 +0000 UTC m=+0.091412463 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 29 02:52:49 np0005539505 nova_compute[186958]: 2025-11-29 07:52:49.310 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:49 np0005539505 nova_compute[186958]: 2025-11-29 07:52:49.311 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:52:49 np0005539505 nova_compute[186958]: 2025-11-29 07:52:49.311 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:52:50 np0005539505 nova_compute[186958]: 2025-11-29 07:52:50.162 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:52:50 np0005539505 nova_compute[186958]: 2025-11-29 07:52:50.164 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:52:50 np0005539505 nova_compute[186958]: 2025-11-29 07:52:50.164 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:52:50 np0005539505 nova_compute[186958]: 2025-11-29 07:52:50.164 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.359 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.757 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.770 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:51.772 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:52:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:51.777 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.788 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.788 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:52:51 np0005539505 nova_compute[186958]: 2025-11-29 07:52:51.788 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:52 np0005539505 nova_compute[186958]: 2025-11-29 07:52:52.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:53 np0005539505 nova_compute[186958]: 2025-11-29 07:52:53.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:53 np0005539505 nova_compute[186958]: 2025-11-29 07:52:53.434 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:53 np0005539505 podman[252637]: 2025-11-29 07:52:53.745296416 +0000 UTC m=+0.061571717 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:52:53 np0005539505 podman[252636]: 2025-11-29 07:52:53.768394531 +0000 UTC m=+0.105555574 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.057 186962 DEBUG nova.compute.manager [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.057 186962 DEBUG nova.compute.manager [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing instance network info cache due to event network-changed-8458662e-9962-45e2-b5a5-fc24dafd350c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.057 186962 DEBUG oslo_concurrency.lockutils [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.057 186962 DEBUG oslo_concurrency.lockutils [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.058 186962 DEBUG nova.network.neutron [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Refreshing network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.125 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.126 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.126 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.126 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.126 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.140 186962 INFO nova.compute.manager [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Terminating instance#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.169 186962 DEBUG nova.compute.manager [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:52:55 np0005539505 kernel: tap8458662e-99 (unregistering): left promiscuous mode
Nov 29 02:52:55 np0005539505 NetworkManager[55134]: <info>  [1764402775.1958] device (tap8458662e-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.209 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:55Z|00821|binding|INFO|Releasing lport 8458662e-9962-45e2-b5a5-fc24dafd350c from this chassis (sb_readonly=0)
Nov 29 02:52:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:55Z|00822|binding|INFO|Setting lport 8458662e-9962-45e2-b5a5-fc24dafd350c down in Southbound
Nov 29 02:52:55 np0005539505 ovn_controller[95143]: 2025-11-29T07:52:55Z|00823|binding|INFO|Removing iface tap8458662e-99 ovn-installed in OVS
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.211 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.218 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:17:b1 10.100.0.4'], port_security=['fa:16:3e:9a:17:b1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7bd4d93e-30c7-452c-ac48-bbd6cf739b04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '10ccdb85b00c46b998605e9c401e8471', 'neutron:revision_number': '4', 'neutron:security_group_ids': '182ed522-20b6-46f8-b127-0094ebdc5f1e eb12b87d-71a1-463c-9452-9a8a13cd5539', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7ee39ebc-2ea6-4a36-95ac-0df5bb2bee30, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=8458662e-9962-45e2-b5a5-fc24dafd350c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.220 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 8458662e-9962-45e2-b5a5-fc24dafd350c in datapath 083a6a6e-74f7-4aca-8db7-8efd2372a971 unbound from our chassis#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.221 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 083a6a6e-74f7-4aca-8db7-8efd2372a971, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.223 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4c05877f-f715-4f6e-a002-f1e8d1f06d90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.224 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971 namespace which is not needed anymore#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.226 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Nov 29 02:52:55 np0005539505 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000b1.scope: Consumed 13.856s CPU time.
Nov 29 02:52:55 np0005539505 systemd-machined[153285]: Machine qemu-85-instance-000000b1 terminated.
Nov 29 02:52:55 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [NOTICE]   (252489) : haproxy version is 2.8.14-c23fe91
Nov 29 02:52:55 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [NOTICE]   (252489) : path to executable is /usr/sbin/haproxy
Nov 29 02:52:55 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [WARNING]  (252489) : Exiting Master process...
Nov 29 02:52:55 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [ALERT]    (252489) : Current worker (252491) exited with code 143 (Terminated)
Nov 29 02:52:55 np0005539505 neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971[252485]: [WARNING]  (252489) : All workers exited. Exiting... (0)
Nov 29 02:52:55 np0005539505 systemd[1]: libpod-a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f.scope: Deactivated successfully.
Nov 29 02:52:55 np0005539505 podman[252698]: 2025-11-29 07:52:55.354830086 +0000 UTC m=+0.047195789 container died a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 29 02:52:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay-544365cb9b1649369cc15a1a47576a2af559ddffc22dbbca82c310970a29794c-merged.mount: Deactivated successfully.
Nov 29 02:52:55 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f-userdata-shm.mount: Deactivated successfully.
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.388 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.393 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 podman[252698]: 2025-11-29 07:52:55.395463148 +0000 UTC m=+0.087828851 container cleanup a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:52:55 np0005539505 systemd[1]: libpod-conmon-a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f.scope: Deactivated successfully.
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.441 186962 INFO nova.virt.libvirt.driver [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Instance destroyed successfully.#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.441 186962 DEBUG nova.objects.instance [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lazy-loading 'resources' on Instance uuid 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.455 186962 DEBUG nova.virt.libvirt.vif [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:52:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2040274870-access_point-589628258',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2040274870-ac',id=177,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAWzBFc8/3Sb6BS1/ZrfkfNaj2jrRaAZoEVao7d76mDDV5hsT8XfpQrn3nMB5pU7+moluz7SWJrm47HKfej+hy+pagM6gZl2OaZEh+D1RFl9WdunPf0X78trv+mSaw/Pwg==',key_name='tempest-TestSecurityGroupsBasicOps-1012627486',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:52:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='10ccdb85b00c46b998605e9c401e8471',ramdisk_id='',reservation_id='r-zqldyzm4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2040274870',owner_user_name='tempest-TestSecurityGroupsBasicOps-2040274870-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:52:28Z,user_data=None,user_id='65b6a37387a54d5da4a426cd229516da',uuid=7bd4d93e-30c7-452c-ac48-bbd6cf739b04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.456 186962 DEBUG nova.network.os_vif_util [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converting VIF {"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.457 186962 DEBUG nova.network.os_vif_util [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.457 186962 DEBUG os_vif [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.459 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.460 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8458662e-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.462 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.464 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 podman[252737]: 2025-11-29 07:52:55.466462441 +0000 UTC m=+0.051712287 container remove a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.467 186962 INFO os_vif [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9a:17:b1,bridge_name='br-int',has_traffic_filtering=True,id=8458662e-9962-45e2-b5a5-fc24dafd350c,network=Network(083a6a6e-74f7-4aca-8db7-8efd2372a971),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8458662e-99')#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.468 186962 INFO nova.virt.libvirt.driver [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Deleting instance files /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04_del#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.469 186962 INFO nova.virt.libvirt.driver [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Deletion of /var/lib/nova/instances/7bd4d93e-30c7-452c-ac48-bbd6cf739b04_del complete#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.471 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ac96813a-a0e6-4694-b1bf-bd2fca6b30a5]: (4, ('Sat Nov 29 07:52:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971 (a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f)\na866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f\nSat Nov 29 07:52:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971 (a866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f)\na866eee422856784a07c86de914c0b552d5ca1b91aa9c9e9d07fe4601158fd0f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.474 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fa55c2ba-134a-4ba0-bd46-fafbe3efaa99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.475 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap083a6a6e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.476 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 kernel: tap083a6a6e-70: left promiscuous mode
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.488 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.491 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c7509279-a6b1-48f6-b58d-625591618755]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.507 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[98060b0a-9ae8-45a7-8a99-b38dd4dc5ca2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.509 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0101ed-89ac-48fb-911a-10e3f787522a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.523 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6e60aa74-0394-4dcd-8821-eea6da73f38f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 819553, 'reachable_time': 30571, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252757, 'error': None, 'target': 'ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 systemd[1]: run-netns-ovnmeta\x2d083a6a6e\x2d74f7\x2d4aca\x2d8db7\x2d8efd2372a971.mount: Deactivated successfully.
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.529 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-083a6a6e-74f7-4aca-8db7-8efd2372a971 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:52:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:52:55.529 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[3d22d580-d7b0-432a-b9b0-5075a524506a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.541 186962 INFO nova.compute.manager [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.542 186962 DEBUG oslo.service.loopingcall [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.542 186962 DEBUG nova.compute.manager [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:52:55 np0005539505 nova_compute[186958]: 2025-11-29 07:52:55.543 186962 DEBUG nova.network.neutron [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.849 186962 DEBUG nova.network.neutron [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.881 186962 INFO nova.compute.manager [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Took 1.34 seconds to deallocate network for instance.#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.910 186962 DEBUG nova.network.neutron [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updated VIF entry in instance network info cache for port 8458662e-9962-45e2-b5a5-fc24dafd350c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.911 186962 DEBUG nova.network.neutron [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [{"id": "8458662e-9962-45e2-b5a5-fc24dafd350c", "address": "fa:16:3e:9a:17:b1", "network": {"id": "083a6a6e-74f7-4aca-8db7-8efd2372a971", "bridge": "br-int", "label": "tempest-network-smoke--918646947", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "10ccdb85b00c46b998605e9c401e8471", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8458662e-99", "ovs_interfaceid": "8458662e-9962-45e2-b5a5-fc24dafd350c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.939 186962 DEBUG oslo_concurrency.lockutils [req-0669a575-412c-4a81-8e03-a74d1f6e7d0e req-c5c438d7-3e43-4416-bc4e-2d2017a0e98d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-7bd4d93e-30c7-452c-ac48-bbd6cf739b04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.957 186962 DEBUG nova.compute.manager [req-1ae6aa27-674c-4a86-ae5e-a686a73d3090 req-02ee238a-af87-4515-b33f-6e18ce8b0554 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-vif-deleted-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.958 186962 INFO nova.compute.manager [req-1ae6aa27-674c-4a86-ae5e-a686a73d3090 req-02ee238a-af87-4515-b33f-6e18ce8b0554 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Neutron deleted interface 8458662e-9962-45e2-b5a5-fc24dafd350c; detaching it from the instance and deleting it from the info cache#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.958 186962 DEBUG nova.network.neutron [req-1ae6aa27-674c-4a86-ae5e-a686a73d3090 req-02ee238a-af87-4515-b33f-6e18ce8b0554 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.962 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:56 np0005539505 nova_compute[186958]: 2025-11-29 07:52:56.963 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.066 186962 DEBUG nova.compute.manager [req-1ae6aa27-674c-4a86-ae5e-a686a73d3090 req-02ee238a-af87-4515-b33f-6e18ce8b0554 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Detach interface failed, port_id=8458662e-9962-45e2-b5a5-fc24dafd350c, reason: Instance 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.103 186962 DEBUG nova.compute.provider_tree [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.123 186962 DEBUG nova.scheduler.client.report [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.146 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.168 186962 DEBUG nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-vif-unplugged-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.168 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.168 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.169 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.169 186962 DEBUG nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] No waiting events found dispatching network-vif-unplugged-8458662e-9962-45e2-b5a5-fc24dafd350c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.169 186962 WARNING nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received unexpected event network-vif-unplugged-8458662e-9962-45e2-b5a5-fc24dafd350c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.169 186962 DEBUG nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.170 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.170 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.170 186962 DEBUG oslo_concurrency.lockutils [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.170 186962 DEBUG nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] No waiting events found dispatching network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.170 186962 WARNING nova.compute.manager [req-2bf2c3db-4133-4463-b7e9-47843d699b49 req-6933e65d-2b34-46bd-a203-4ecc18b4da56 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Received unexpected event network-vif-plugged-8458662e-9962-45e2-b5a5-fc24dafd350c for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.181 186962 INFO nova.scheduler.client.report [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Deleted allocations for instance 7bd4d93e-30c7-452c-ac48-bbd6cf739b04#033[00m
Nov 29 02:52:57 np0005539505 nova_compute[186958]: 2025-11-29 07:52:57.275 186962 DEBUG oslo_concurrency.lockutils [None req-4c0b8bf5-30c2-4b74-9994-513e5a4fda30 65b6a37387a54d5da4a426cd229516da 10ccdb85b00c46b998605e9c401e8471 - - default default] Lock "7bd4d93e-30c7-452c-ac48-bbd6cf739b04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:52:58 np0005539505 nova_compute[186958]: 2025-11-29 07:52:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:52:58 np0005539505 nova_compute[186958]: 2025-11-29 07:52:58.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:00 np0005539505 nova_compute[186958]: 2025-11-29 07:53:00.464 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:01 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:01.781 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:53:01 np0005539505 nova_compute[186958]: 2025-11-29 07:53:01.820 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:02 np0005539505 nova_compute[186958]: 2025-11-29 07:53:02.003 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:03 np0005539505 nova_compute[186958]: 2025-11-29 07:53:03.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:05 np0005539505 nova_compute[186958]: 2025-11-29 07:53:05.467 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:07 np0005539505 nova_compute[186958]: 2025-11-29 07:53:07.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:07 np0005539505 podman[252760]: 2025-11-29 07:53:07.728108963 +0000 UTC m=+0.058454338 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:53:07 np0005539505 podman[252759]: 2025-11-29 07:53:07.743086298 +0000 UTC m=+0.073811064 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 29 02:53:08 np0005539505 nova_compute[186958]: 2025-11-29 07:53:08.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:10 np0005539505 nova_compute[186958]: 2025-11-29 07:53:10.441 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402775.4391334, 7bd4d93e-30c7-452c-ac48-bbd6cf739b04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:53:10 np0005539505 nova_compute[186958]: 2025-11-29 07:53:10.441 186962 INFO nova.compute.manager [-] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:53:10 np0005539505 nova_compute[186958]: 2025-11-29 07:53:10.458 186962 DEBUG nova.compute.manager [None req-bad54f1d-314f-4353-94f4-e5b9d18175aa - - - - - -] [instance: 7bd4d93e-30c7-452c-ac48-bbd6cf739b04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:53:10 np0005539505 nova_compute[186958]: 2025-11-29 07:53:10.470 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:12 np0005539505 podman[252804]: 2025-11-29 07:53:12.741255986 +0000 UTC m=+0.078675023 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:53:13 np0005539505 nova_compute[186958]: 2025-11-29 07:53:13.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:15 np0005539505 nova_compute[186958]: 2025-11-29 07:53:15.473 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:18 np0005539505 nova_compute[186958]: 2025-11-29 07:53:18.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:19 np0005539505 podman[252823]: 2025-11-29 07:53:19.737901535 +0000 UTC m=+0.065171269 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:53:19 np0005539505 podman[252824]: 2025-11-29 07:53:19.81108805 +0000 UTC m=+0.139277650 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible)
Nov 29 02:53:20 np0005539505 nova_compute[186958]: 2025-11-29 07:53:20.396 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:20 np0005539505 nova_compute[186958]: 2025-11-29 07:53:20.397 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:53:20 np0005539505 nova_compute[186958]: 2025-11-29 07:53:20.475 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:20 np0005539505 nova_compute[186958]: 2025-11-29 07:53:20.588 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:53:21 np0005539505 nova_compute[186958]: 2025-11-29 07:53:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:21 np0005539505 nova_compute[186958]: 2025-11-29 07:53:21.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:53:23 np0005539505 nova_compute[186958]: 2025-11-29 07:53:23.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:24 np0005539505 podman[252874]: 2025-11-29 07:53:24.733305816 +0000 UTC m=+0.062534065 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:53:24 np0005539505 podman[252873]: 2025-11-29 07:53:24.753092227 +0000 UTC m=+0.081671777 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 29 02:53:25 np0005539505 nova_compute[186958]: 2025-11-29 07:53:25.479 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:27.538 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:27.539 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:27.539 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:28 np0005539505 nova_compute[186958]: 2025-11-29 07:53:28.483 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:30 np0005539505 nova_compute[186958]: 2025-11-29 07:53:30.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:30 np0005539505 nova_compute[186958]: 2025-11-29 07:53:30.639 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:33 np0005539505 nova_compute[186958]: 2025-11-29 07:53:33.484 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:35 np0005539505 nova_compute[186958]: 2025-11-29 07:53:35.545 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:38 np0005539505 nova_compute[186958]: 2025-11-29 07:53:38.487 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:38 np0005539505 podman[252910]: 2025-11-29 07:53:38.720093785 +0000 UTC m=+0.055332349 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, release=1755695350, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Nov 29 02:53:38 np0005539505 podman[252911]: 2025-11-29 07:53:38.743763756 +0000 UTC m=+0.075090040 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:53:39 np0005539505 nova_compute[186958]: 2025-11-29 07:53:39.399 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:40 np0005539505 nova_compute[186958]: 2025-11-29 07:53:40.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:42 np0005539505 nova_compute[186958]: 2025-11-29 07:53:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:43 np0005539505 nova_compute[186958]: 2025-11-29 07:53:43.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:43 np0005539505 nova_compute[186958]: 2025-11-29 07:53:43.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:53:43 np0005539505 nova_compute[186958]: 2025-11-29 07:53:43.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:43 np0005539505 podman[252954]: 2025-11-29 07:53:43.736888673 +0000 UTC m=+0.063166641 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 02:53:45 np0005539505 nova_compute[186958]: 2025-11-29 07:53:45.551 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.409 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.410 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.410 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.491 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.562 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.563 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.07106399536133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.629 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.629 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.650 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.672 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.713 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:53:48 np0005539505 nova_compute[186958]: 2025-11-29 07:53:48.713 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:53:49 np0005539505 nova_compute[186958]: 2025-11-29 07:53:49.714 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:49 np0005539505 nova_compute[186958]: 2025-11-29 07:53:49.715 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:53:49 np0005539505 nova_compute[186958]: 2025-11-29 07:53:49.715 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:53:49 np0005539505 nova_compute[186958]: 2025-11-29 07:53:49.735 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:53:49 np0005539505 nova_compute[186958]: 2025-11-29 07:53:49.735 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:50 np0005539505 nova_compute[186958]: 2025-11-29 07:53:50.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:50 np0005539505 podman[252976]: 2025-11-29 07:53:50.712165265 +0000 UTC m=+0.047927690 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:53:50 np0005539505 podman[252977]: 2025-11-29 07:53:50.7486864 +0000 UTC m=+0.081128351 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:53:53 np0005539505 nova_compute[186958]: 2025-11-29 07:53:53.493 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:54 np0005539505 nova_compute[186958]: 2025-11-29 07:53:54.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:55 np0005539505 nova_compute[186958]: 2025-11-29 07:53:55.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:55 np0005539505 nova_compute[186958]: 2025-11-29 07:53:55.557 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:55 np0005539505 podman[253024]: 2025-11-29 07:53:55.73244359 +0000 UTC m=+0.065165378 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:53:55 np0005539505 podman[253025]: 2025-11-29 07:53:55.750493192 +0000 UTC m=+0.082755967 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:53:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:57.040 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:53:57 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:53:57.041 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:53:57 np0005539505 nova_compute[186958]: 2025-11-29 07:53:57.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:53:58 np0005539505 nova_compute[186958]: 2025-11-29 07:53:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:53:58 np0005539505 nova_compute[186958]: 2025-11-29 07:53:58.496 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:00 np0005539505 nova_compute[186958]: 2025-11-29 07:54:00.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:02 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:02.045 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:03 np0005539505 nova_compute[186958]: 2025-11-29 07:54:03.497 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:03 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:03Z|00824|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 02:54:05 np0005539505 nova_compute[186958]: 2025-11-29 07:54:05.563 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:08 np0005539505 nova_compute[186958]: 2025-11-29 07:54:08.499 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:09 np0005539505 podman[253060]: 2025-11-29 07:54:09.723033928 +0000 UTC m=+0.058490040 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Nov 29 02:54:09 np0005539505 podman[253061]: 2025-11-29 07:54:09.751415093 +0000 UTC m=+0.084120677 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:54:10 np0005539505 nova_compute[186958]: 2025-11-29 07:54:10.567 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:13 np0005539505 nova_compute[186958]: 2025-11-29 07:54:13.501 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:14 np0005539505 podman[253103]: 2025-11-29 07:54:14.771911625 +0000 UTC m=+0.100704338 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 02:54:15 np0005539505 nova_compute[186958]: 2025-11-29 07:54:15.570 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:18 np0005539505 nova_compute[186958]: 2025-11-29 07:54:18.504 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:20 np0005539505 nova_compute[186958]: 2025-11-29 07:54:20.573 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:21 np0005539505 podman[253122]: 2025-11-29 07:54:21.737192293 +0000 UTC m=+0.073537396 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:54:21 np0005539505 podman[253123]: 2025-11-29 07:54:21.753170526 +0000 UTC m=+0.085148406 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 02:54:23 np0005539505 nova_compute[186958]: 2025-11-29 07:54:23.505 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:24 np0005539505 nova_compute[186958]: 2025-11-29 07:54:24.673 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:24 np0005539505 nova_compute[186958]: 2025-11-29 07:54:24.674 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:24 np0005539505 nova_compute[186958]: 2025-11-29 07:54:24.808 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.575 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.858 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.858 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.866 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.867 186962 INFO nova.compute.claims [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:54:25 np0005539505 nova_compute[186958]: 2025-11-29 07:54:25.993 186962 DEBUG nova.compute.provider_tree [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.011 186962 DEBUG nova.scheduler.client.report [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.042 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.043 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.107 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.107 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.137 186962 INFO nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.158 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.293 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.294 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.295 186962 INFO nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Creating image(s)#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.295 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.295 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.296 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.313 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.378 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.379 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.380 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.391 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.449 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.450 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:26 np0005539505 podman[253180]: 2025-11-29 07:54:26.745292714 +0000 UTC m=+0.070799909 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:54:26 np0005539505 podman[253179]: 2025-11-29 07:54:26.753324082 +0000 UTC m=+0.075037629 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:54:26 np0005539505 nova_compute[186958]: 2025-11-29 07:54:26.978 186962 DEBUG nova.policy [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.072 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk 1073741824" returned: 0 in 0.622s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.073 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.073 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.137 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.138 186962 DEBUG nova.virt.disk.api [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.139 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.201 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.202 186962 DEBUG nova.virt.disk.api [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.202 186962 DEBUG nova.objects.instance [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid f395beac-b14d-4701-bbfe-1190216043d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.475 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.476 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Ensure instance console log exists: /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.476 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.477 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:27 np0005539505 nova_compute[186958]: 2025-11-29 07:54:27.478 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:27.539 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:27.540 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:27.540 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:28 np0005539505 nova_compute[186958]: 2025-11-29 07:54:28.506 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:30 np0005539505 nova_compute[186958]: 2025-11-29 07:54:30.078 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Successfully created port: cd560093-fc70-43b0-889b-598fc21465d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:54:30 np0005539505 nova_compute[186958]: 2025-11-29 07:54:30.578 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.252 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Successfully updated port: cd560093-fc70-43b0-889b-598fc21465d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.269 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.269 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.270 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.355 186962 DEBUG nova.compute.manager [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-changed-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.355 186962 DEBUG nova.compute.manager [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Refreshing instance network info cache due to event network-changed-cd560093-fc70-43b0-889b-598fc21465d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.356 186962 DEBUG oslo_concurrency.lockutils [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:54:31 np0005539505 nova_compute[186958]: 2025-11-29 07:54:31.400 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.544 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.563 186962 DEBUG nova.network.neutron [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updating instance_info_cache with network_info: [{"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.580 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.581 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Instance network_info: |[{"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.581 186962 DEBUG oslo_concurrency.lockutils [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.581 186962 DEBUG nova.network.neutron [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Refreshing network info cache for port cd560093-fc70-43b0-889b-598fc21465d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.584 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Start _get_guest_xml network_info=[{"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.590 186962 WARNING nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.594 186962 DEBUG nova.virt.libvirt.host [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.595 186962 DEBUG nova.virt.libvirt.host [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.598 186962 DEBUG nova.virt.libvirt.host [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.599 186962 DEBUG nova.virt.libvirt.host [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.600 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.601 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.601 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.601 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.602 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.602 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.602 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.603 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.603 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.603 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.603 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.604 186962 DEBUG nova.virt.hardware [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.609 186962 DEBUG nova.virt.libvirt.vif [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:54:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=179,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-j07er5w3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:54:26Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=f395beac-b14d-4701-bbfe-1190216043d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.609 186962 DEBUG nova.network.os_vif_util [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.610 186962 DEBUG nova.network.os_vif_util [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.611 186962 DEBUG nova.objects.instance [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid f395beac-b14d-4701-bbfe-1190216043d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.627 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <uuid>f395beac-b14d-4701-bbfe-1190216043d1</uuid>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <name>instance-000000b3</name>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481</nova:name>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:54:33</nova:creationTime>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        <nova:port uuid="cd560093-fc70-43b0-889b-598fc21465d4">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="serial">f395beac-b14d-4701-bbfe-1190216043d1</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="uuid">f395beac-b14d-4701-bbfe-1190216043d1</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.config"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:be:af:66"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <target dev="tapcd560093-fc"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/console.log" append="off"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:54:33 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:54:33 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:54:33 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:54:33 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.629 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Preparing to wait for external event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.629 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.629 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.629 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.630 186962 DEBUG nova.virt.libvirt.vif [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:54:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=179,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-j07er5w3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:54:26Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=f395beac-b14d-4701-bbfe-1190216043d1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.630 186962 DEBUG nova.network.os_vif_util [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.631 186962 DEBUG nova.network.os_vif_util [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.631 186962 DEBUG os_vif [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.632 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.633 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.635 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.636 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd560093-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.636 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcd560093-fc, col_values=(('external_ids', {'iface-id': 'cd560093-fc70-43b0-889b-598fc21465d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:af:66', 'vm-uuid': 'f395beac-b14d-4701-bbfe-1190216043d1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539505 NetworkManager[55134]: <info>  [1764402873.6391] manager: (tapcd560093-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/407)
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.649 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.651 186962 INFO os_vif [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc')#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.730 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.731 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.731 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:be:af:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:54:33 np0005539505 nova_compute[186958]: 2025-11-29 07:54:33.732 186962 INFO nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Using config drive#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.107 186962 INFO nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Creating config drive at /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.config#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.112 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3d3m6z8a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.242 186962 DEBUG oslo_concurrency.processutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3d3m6z8a" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:34 np0005539505 kernel: tapcd560093-fc: entered promiscuous mode
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.3202] manager: (tapcd560093-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/408)
Nov 29 02:54:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:34Z|00825|binding|INFO|Claiming lport cd560093-fc70-43b0-889b-598fc21465d4 for this chassis.
Nov 29 02:54:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:34Z|00826|binding|INFO|cd560093-fc70-43b0-889b-598fc21465d4: Claiming fa:16:3e:be:af:66 10.100.0.8
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.321 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.325 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.329 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.3350] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/409)
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.334 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.3357] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/410)
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.339 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:af:66 10.100.0.8'], port_security=['fa:16:3e:be:af:66 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8429e89c-8540-4db3-b6b2-48775311a13d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '302ff4eb-5b37-47a5-8263-6df9580417a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab1419e5-3fc4-47d1-a2be-d34ec9f548ab, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=cd560093-fc70-43b0-889b-598fc21465d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.341 104094 INFO neutron.agent.ovn.metadata.agent [-] Port cd560093-fc70-43b0-889b-598fc21465d4 in datapath 8429e89c-8540-4db3-b6b2-48775311a13d bound to our chassis#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.343 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8429e89c-8540-4db3-b6b2-48775311a13d#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.356 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e73a9423-625f-420e-b28d-266fb4cb90a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.357 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8429e89c-81 in ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:54:34 np0005539505 systemd-udevd[253242]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.358 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8429e89c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.359 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0aed38bd-7330-4fae-92e2-6064b4868ff7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.360 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[77728eb9-a2fc-4114-a15d-47588f175c3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 systemd-machined[153285]: New machine qemu-86-instance-000000b3.
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.3716] device (tapcd560093-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.3727] device (tapcd560093-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.379 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9a4127-6392-43ec-adf9-8d9e07ca6524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.413 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cb44ea-062a-4ae2-a368-fd45db5f8138]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 systemd[1]: Started Virtual Machine qemu-86-instance-000000b3.
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.450 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[93bb32ac-3a7f-4027-a729-5a8bf41ef42f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.4727] manager: (tap8429e89c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/411)
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.470 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e4dc0c2a-3c6e-4090-96d6-3502ef25f3eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.494 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:34Z|00827|binding|INFO|Setting lport cd560093-fc70-43b0-889b-598fc21465d4 ovn-installed in OVS
Nov 29 02:54:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:34Z|00828|binding|INFO|Setting lport cd560093-fc70-43b0-889b-598fc21465d4 up in Southbound
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.505 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.510 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[c11f7ee8-a695-4c52-a79e-55852ea472a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.513 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[51b3cb62-67bb-42fc-9f9b-d686d9b96bae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.5343] device (tap8429e89c-80): carrier: link connected
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.539 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[cb62c6e3-e2d9-45de-8ae1-2e4cc5a8a596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.559 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d35ba1-8398-46c8-bf5d-b1b19f3aab8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8429e89c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:be:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832211, 'reachable_time': 35434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253275, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.573 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[48171c27-43a8-4a95-bdff-2ef323a1a5ee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:be84'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 832211, 'tstamp': 832211}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 253276, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.594 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb1f315-7c81-4f50-8c66-29ed6b97c5a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8429e89c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:be:84'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832211, 'reachable_time': 35434, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 253277, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.631 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[16055c5f-cf74-48b6-a864-3d80ebd5cb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.693 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3e09e8-1be5-4a4b-8bbb-9d2e54d3266c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.696 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8429e89c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.696 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.697 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8429e89c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:34 np0005539505 kernel: tap8429e89c-80: entered promiscuous mode
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 NetworkManager[55134]: <info>  [1764402874.6998] manager: (tap8429e89c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.701 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.703 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8429e89c-80, col_values=(('external_ids', {'iface-id': '09e63821-cfdc-4962-ad09-7970b232d886'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:34 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:34Z|00829|binding|INFO|Releasing lport 09e63821-cfdc-4962-ad09-7970b232d886 from this chassis (sb_readonly=0)
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.706 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.707 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f70107bb-1cb9-415b-aee5-bf2c3b2e85cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.708 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-8429e89c-8540-4db3-b6b2-48775311a13d
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/8429e89c-8540-4db3-b6b2-48775311a13d.pid.haproxy
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 8429e89c-8540-4db3-b6b2-48775311a13d
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:54:34 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:34.709 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'env', 'PROCESS_TAG=haproxy-8429e89c-8540-4db3-b6b2-48775311a13d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8429e89c-8540-4db3-b6b2-48775311a13d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.720 186962 DEBUG nova.compute.manager [req-b699d2a1-f790-4ae1-98d0-f01f0a43c7b6 req-50cb202d-d4e7-4b47-9bb8-ecdf121cd62d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.721 186962 DEBUG oslo_concurrency.lockutils [req-b699d2a1-f790-4ae1-98d0-f01f0a43c7b6 req-50cb202d-d4e7-4b47-9bb8-ecdf121cd62d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.721 186962 DEBUG oslo_concurrency.lockutils [req-b699d2a1-f790-4ae1-98d0-f01f0a43c7b6 req-50cb202d-d4e7-4b47-9bb8-ecdf121cd62d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.721 186962 DEBUG oslo_concurrency.lockutils [req-b699d2a1-f790-4ae1-98d0-f01f0a43c7b6 req-50cb202d-d4e7-4b47-9bb8-ecdf121cd62d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.722 186962 DEBUG nova.compute.manager [req-b699d2a1-f790-4ae1-98d0-f01f0a43c7b6 req-50cb202d-d4e7-4b47-9bb8-ecdf121cd62d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Processing event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.722 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.735 186962 DEBUG nova.network.neutron [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updated VIF entry in instance network info cache for port cd560093-fc70-43b0-889b-598fc21465d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.735 186962 DEBUG nova.network.neutron [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updating instance_info_cache with network_info: [{"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.755 186962 DEBUG oslo_concurrency.lockutils [req-1b8a6e39-857a-4891-893a-060c4f149264 req-4c459bf0-e12d-4dcb-989a-5231451a0caa 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.780 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.781 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402874.7800267, f395beac-b14d-4701-bbfe-1190216043d1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.781 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] VM Started (Lifecycle Event)#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.786 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.790 186962 INFO nova.virt.libvirt.driver [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Instance spawned successfully.#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.791 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.807 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.810 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.817 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.818 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.818 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.819 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.819 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.820 186962 DEBUG nova.virt.libvirt.driver [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.858 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.859 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402874.78013, f395beac-b14d-4701-bbfe-1190216043d1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.859 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.945 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.948 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764402874.7854605, f395beac-b14d-4701-bbfe-1190216043d1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.949 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.979 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.982 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.987 186962 INFO nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Took 8.69 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:54:34 np0005539505 nova_compute[186958]: 2025-11-29 07:54:34.988 186962 DEBUG nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:54:35 np0005539505 nova_compute[186958]: 2025-11-29 07:54:35.019 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:54:35 np0005539505 podman[253316]: 2025-11-29 07:54:35.047881665 +0000 UTC m=+0.022709585 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:54:35 np0005539505 podman[253316]: 2025-11-29 07:54:35.556424502 +0000 UTC m=+0.531252422 container create 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 02:54:35 np0005539505 systemd[1]: Started libpod-conmon-3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85.scope.
Nov 29 02:54:35 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:54:35 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3ff88f35bce6df5c3ec03ec34f45c107d6d5cd035b24837308afa0f8af3e86b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:54:36 np0005539505 podman[253316]: 2025-11-29 07:54:36.294931759 +0000 UTC m=+1.269759679 container init 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 29 02:54:36 np0005539505 podman[253316]: 2025-11-29 07:54:36.300867357 +0000 UTC m=+1.275695247 container start 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:54:36 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [NOTICE]   (253336) : New worker (253338) forked
Nov 29 02:54:36 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [NOTICE]   (253336) : Loading success.
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.682 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:36.682 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:54:36 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:36.684 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.719 186962 INFO nova.compute.manager [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Took 10.90 seconds to build instance.#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.743 186962 DEBUG oslo_concurrency.lockutils [None req-1d44c2a7-2ab3-42ce-a472-4e0fc796f2d2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.794 186962 DEBUG nova.compute.manager [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.795 186962 DEBUG oslo_concurrency.lockutils [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.795 186962 DEBUG oslo_concurrency.lockutils [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.795 186962 DEBUG oslo_concurrency.lockutils [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.796 186962 DEBUG nova.compute.manager [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] No waiting events found dispatching network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:54:36 np0005539505 nova_compute[186958]: 2025-11-29 07:54:36.796 186962 WARNING nova.compute.manager [req-e581ea0c-f66a-4f4e-92e7-9409f5109bf1 req-d54f991a-d5f6-4df1-9249-9021dc9c3a19 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received unexpected event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:54:38 np0005539505 nova_compute[186958]: 2025-11-29 07:54:38.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:38 np0005539505 nova_compute[186958]: 2025-11-29 07:54:38.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:38 np0005539505 nova_compute[186958]: 2025-11-29 07:54:38.638 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:40 np0005539505 podman[253347]: 2025-11-29 07:54:40.733935317 +0000 UTC m=+0.064051147 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 29 02:54:40 np0005539505 podman[253348]: 2025-11-29 07:54:40.738357952 +0000 UTC m=+0.059241281 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:54:41 np0005539505 nova_compute[186958]: 2025-11-29 07:54:41.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:43 np0005539505 nova_compute[186958]: 2025-11-29 07:54:43.549 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:43 np0005539505 nova_compute[186958]: 2025-11-29 07:54:43.640 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:44 np0005539505 nova_compute[186958]: 2025-11-29 07:54:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:44 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:44.687 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:45 np0005539505 nova_compute[186958]: 2025-11-29 07:54:45.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:45 np0005539505 nova_compute[186958]: 2025-11-29 07:54:45.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:54:45 np0005539505 podman[253391]: 2025-11-29 07:54:45.727911527 +0000 UTC m=+0.064415518 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.114 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b3', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.119 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f395beac-b14d-4701-bbfe-1190216043d1 / tapcd560093-fc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.119 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f899e1a-95fb-4bbe-9700-aaa403e7f737', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.116235', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad603a50-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '3a10edf9e7a173574cbfe8fa7ecbf0d05e2fe3929611e6fc14e7feeff39c6368'}]}, 'timestamp': '2025-11-29 07:54:48.120875', '_unique_id': '8374063b29c2405face49e6f8d5417ea'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.122 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.125 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62653585-2fa4-4fe9-b4f2-5a4f66a6267e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.125422', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6107f0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': 'c3ca4daf206bbfe5dc2853e7c66120f28bd2925c5ea1c0d61079877d143a1ad0'}]}, 'timestamp': '2025-11-29 07:54:48.126120', '_unique_id': '68de1f2d2835458793e1d277606e64ea'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.127 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.128 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.128 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '294d3df9-484c-4a0a-8b29-89d93ca38cd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.128489', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad617816-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': 'e018b1c5943d46172ed62e730949f72a36dc7e330c1bccd5056e19d2a3d2bb93'}]}, 'timestamp': '2025-11-29 07:54:48.128900', '_unique_id': '90d4554386ce4314bb5b2def9c5635ac'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.129 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.153 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.latency volume: 23160213599 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.154 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd81be89a-94af-4aa9-951b-cc57765b8623', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23160213599, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.131025', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad656156-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '2d38d0ab03800afda077a76dee7eee55a1a2f7affd96aab5a6a94968a55867a8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.131025', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6575a6-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '891ff4e1e15575153912ba34d871efdd0406829fbb6dd9b446e9443077dd7b42'}]}, 'timestamp': '2025-11-29 07:54:48.155038', '_unique_id': '1f92a2927cd245c2b79fe7a4fb88acaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.157 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.latency volume: 249494509 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.157 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.latency volume: 17400470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b912894-b4dd-42e1-af58-157470a12e5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 249494509, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.157296', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad65dbea-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '82488b9643b1548d69220579301147c46ab38571d6852b3c44c07f5c142c9003'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17400470, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.157296', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad65e7f2-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '89e51cc5a900fd69bbfb1bc3f715e7e923776d0b9312e3a7846a81dc6a135303'}]}, 'timestamp': '2025-11-29 07:54:48.157936', '_unique_id': '1ca35df48a2e4151b5a1d073ce2d61f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.158 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.176 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/memory.usage volume: 40.4296875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5543cc98-9522-4d69-a3a9-e01179ca0fdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.4296875, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'timestamp': '2025-11-29T07:54:48.159686', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ad68db88-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.817520967, 'message_signature': 'dc7acfc2c23b05d28bfabdbdc0d8d731c6feb44705899f245850f80c74dc4b90'}]}, 'timestamp': '2025-11-29 07:54:48.177353', '_unique_id': '27034d901ccd44d79058569e4756a517'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fa826fb-fc48-4fce-953b-868afae12a7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.179262', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad69343e-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '283884e8e25da0781cb0d13920e686fc1fc148222bca7a3fa6fe1647bb8fbed5'}]}, 'timestamp': '2025-11-29 07:54:48.179526', '_unique_id': 'b3cbb36f14794bd696165c0d98bb4733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.180 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cf25e41-1645-468c-a17b-dc11a692ce35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.180620', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad696846-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '92e70213511d5b6416ee687cd988303ab4777bb97bfeeef1ad15f7c462fd3dfa'}]}, 'timestamp': '2025-11-29 07:54:48.180855', '_unique_id': '4b6c44f252fc42dd8119e51fb4605ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.190 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.190 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '159d59ff-cd5b-45d4-a5c1-b8d3d7019249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.181921', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6ae036-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': '91f7056e02c075e4c3c2610a97943f4de2578f879e286ac9f8ec4e6acabde82c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.181921', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6ae9be-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': '72192cc8e9636731c59f9cc85263e6437e3f3336472434622e13c5979f12756d'}]}, 'timestamp': '2025-11-29 07:54:48.190710', '_unique_id': 'bff911d3dfd34fb6910a010a3dec10eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.191 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.allocation volume: 29106176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cd13693-a20c-446c-b0ad-21babb2468ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29106176, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.191929', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6b2276-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': '4ce3e714445c8d5b088254f363f633c3de86df508532e878cea11a73e6c5d62d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.191929', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6b2b7c-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': 'd7ff5deb35a401ed44ca155084c5bf407880754f20a598d5a64b201cc7b3792c'}]}, 'timestamp': '2025-11-29 07:54:48.192392', '_unique_id': 'c12b77c845114a2e9eaab39a6e35ca33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.192 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.193 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '910eb00c-9e32-43dc-a490-3e3c859c633e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.193488', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6b5ed0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': 'a9add03c20365888d180d97b9202293cc524101a801d57b701760f33e2e07e0c'}]}, 'timestamp': '2025-11-29 07:54:48.193717', '_unique_id': 'a537d86309df487cbf4567b761cacf9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.194 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e78586c7-8716-4ea1-839c-ece7418689ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.194900', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6b963e-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': 'c58d21cb5402516a722ce25b174e5e6b56065a292382430835db73359de3adc4'}]}, 'timestamp': '2025-11-29 07:54:48.195140', '_unique_id': '31810c2ecbf54a34967b5ef2ec4e03b6'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09c420e8-1077-4fc4-8c51-0ffe758a0271', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.196229', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6bcb04-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '3ad3bb748561f38b70506666dbe06515a7c00db25480b9089c1a5b425743bcc2'}]}, 'timestamp': '2025-11-29 07:54:48.196489', '_unique_id': '1edaac2c1a8a4e1a96602e338f9da4e3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.197 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.bytes volume: 27098112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.197 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.bytes volume: 221502 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8865cb1-99aa-4127-af54-658092345a8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27098112, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.197584', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6bfe9e-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '3fb55fce3e11c797251cc34e84a1bad76cab91aed2abbce80bb961216bd9f259'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221502, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.197584', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6c0664-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '30702d151c8df291cf22af758f5084fb9f00736b8d4e0b4b1314ffc5fe51d6ee'}]}, 'timestamp': '2025-11-29 07:54:48.197999', '_unique_id': '74d039b42e7b483983d416e2a12400d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.199 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.199 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>]
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.199 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49ab3254-c17b-447f-bbde-cfa46df14ab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.199521', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6c4a7a-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '9ed1bf29234a59e836911613265d1fff9e153092aa5413d9f9e24e1d267ca431'}]}, 'timestamp': '2025-11-29 07:54:48.199753', '_unique_id': '1b59031b81e941b5a201a9bbe85bbb0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.200 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.requests volume: 931 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.read.requests volume: 95 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f506504-67ba-4e88-9008-080bd856684c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 931, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.200814', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6c7cc0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '7e1bdbdd3f6fcd02141898f398ac61dccfba7c9efe0485ccad66971588e8c5a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 95, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.200814', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6c8526-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': 'c0797c4da1bd7fda56f2a54b5fe454a63afc34924b7954bd267552ba56aeb0aa'}]}, 'timestamp': '2025-11-29 07:54:48.201255', '_unique_id': '0872c2c2c28d4eb09bcfd30330c46297'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>]
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>]
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.203 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.203 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481>]
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.203 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.requests volume: 263 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.203 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '75c870f1-0ce7-4131-9f48-2cc56a73d85d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 263, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.203340', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6cdf8a-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '58c7e2853459b6818cf8da853b97a717122cfffbd3d1df839b2a251d56e3609c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.203340', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6ce764-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '158851cb433a9319926a761a19c00f85f15240f5fb3ee55f0068add01846b65f'}]}, 'timestamp': '2025-11-29 07:54:48.203750', '_unique_id': '5cdc397ab6684f258ae0b20d7397ee21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.204 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8aaea6eb-8cad-4af7-b67e-56c8048d9b23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b3-f395beac-b14d-4701-bbfe-1190216043d1-tapcd560093-fc', 'timestamp': '2025-11-29T07:54:48.204881', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'tapcd560093-fc', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:be:af:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapcd560093-fc'}, 'message_id': 'ad6d1bda-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.757152446, 'message_signature': '54349211f222746726380b0748bf4c529c1fcb2376f12f1f904b1a2b5dd615bc'}]}, 'timestamp': '2025-11-29 07:54:48.205113', '_unique_id': 'a8ec870b949a4a39a8a2b28868e19986'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.206 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.usage volume: 28246016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.206 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4801e485-c660-4ed9-830d-deed78355d0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28246016, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.206177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6d4ef2-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': 'a5adfe5cd188b30ffa56cb0cbba10c056fcb3bdc181aceddd247af9280f5b990'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.206177', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6d56ea-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.822750945, 'message_signature': '90b2b724ab3c37274d3a8e3a6a31e3ca9c3b39241363d40a7fb0c3e161018cc2'}]}, 'timestamp': '2025-11-29 07:54:48.206631', '_unique_id': 'ad07a46541104157a053683c5ee9363a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.207 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.bytes volume: 25628672 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9408afd3-e999-4d49-9d7b-e6531730e4c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25628672, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-vda', 'timestamp': '2025-11-29T07:54:48.207769', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad6d8cf0-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '1634c9e5e7690632a0afeb8f1f982d210c0d568e89bea33fe8343e82a2c54fb3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1-sda', 'timestamp': '2025-11-29T07:54:48.207769', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad6d972c-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.771901064, 'message_signature': '7d09b4f4946e3b8b805176b6ea5304b3b06672258c96afb378181f1993613c0d'}]}, 'timestamp': '2025-11-29 07:54:48.208276', '_unique_id': '3b035127c49d40eba1610978796f4466'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.209 12 DEBUG ceilometer.compute.pollsters [-] f395beac-b14d-4701-bbfe-1190216043d1/cpu volume: 10990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40e27ffe-09b6-48eb-a261-bee7c2f25f84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10990000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'timestamp': '2025-11-29T07:54:48.209502', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481', 'name': 'instance-000000b3', 'instance_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ad6dd03e-ccf8-11f0-8954-fa163e5a5606', 'monotonic_time': 8335.817520967, 'message_signature': '1f64f442905d336b48e15d682c7bae24304ceedb43d818d3fb5d124c5a6c7805'}]}, 'timestamp': '2025-11-29 07:54:48.209725', '_unique_id': '7b213871546a4d5e9f538e979f51f27a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:54:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:54:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:54:48 np0005539505 nova_compute[186958]: 2025-11-29 07:54:48.551 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:48 np0005539505 nova_compute[186958]: 2025-11-29 07:54:48.642 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:48 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:48Z|00096|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:be:af:66 10.100.0.8
Nov 29 02:54:48 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:48Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:af:66 10.100.0.8
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.592 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.592 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.593 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.593 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.776 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.908 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.909 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:54:49 np0005539505 nova_compute[186958]: 2025-11-29 07:54:49.967 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.138 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.139 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5513MB free_disk=73.04329681396484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.139 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.140 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.442 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance f395beac-b14d-4701-bbfe-1190216043d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.442 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.442 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.712 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.802 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.831 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:54:50 np0005539505 nova_compute[186958]: 2025-11-29 07:54:50.831 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:51 np0005539505 nova_compute[186958]: 2025-11-29 07:54:51.832 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:51 np0005539505 nova_compute[186958]: 2025-11-29 07:54:51.833 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:54:51 np0005539505 nova_compute[186958]: 2025-11-29 07:54:51.833 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:54:52 np0005539505 podman[253435]: 2025-11-29 07:54:52.730483242 +0000 UTC m=+0.063002707 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 02:54:52 np0005539505 podman[253436]: 2025-11-29 07:54:52.794836876 +0000 UTC m=+0.122739670 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.004 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.005 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.005 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.005 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f395beac-b14d-4701-bbfe-1190216043d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.554 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:53 np0005539505 nova_compute[186958]: 2025-11-29 07:54:53.644 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.660 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.661 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.661 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.661 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.662 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:54:55 np0005539505 nova_compute[186958]: 2025-11-29 07:54:55.981 186962 INFO nova.compute.manager [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Terminating instance#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.026 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updating instance_info_cache with network_info: [{"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.048 186962 DEBUG nova.compute.manager [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:54:56 np0005539505 kernel: tapcd560093-fc (unregistering): left promiscuous mode
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.068 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-f395beac-b14d-4701-bbfe-1190216043d1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.069 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.069 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.069 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:54:56 np0005539505 NetworkManager[55134]: <info>  [1764402896.0715] device (tapcd560093-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.084 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:56Z|00830|binding|INFO|Releasing lport cd560093-fc70-43b0-889b-598fc21465d4 from this chassis (sb_readonly=0)
Nov 29 02:54:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:56Z|00831|binding|INFO|Setting lport cd560093-fc70-43b0-889b-598fc21465d4 down in Southbound
Nov 29 02:54:56 np0005539505 ovn_controller[95143]: 2025-11-29T07:54:56Z|00832|binding|INFO|Removing iface tapcd560093-fc ovn-installed in OVS
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.086 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.098 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:af:66 10.100.0.8'], port_security=['fa:16:3e:be:af:66 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f395beac-b14d-4701-bbfe-1190216043d1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8429e89c-8540-4db3-b6b2-48775311a13d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '302ff4eb-5b37-47a5-8263-6df9580417a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ab1419e5-3fc4-47d1-a2be-d34ec9f548ab, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=cd560093-fc70-43b0-889b-598fc21465d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.100 104094 INFO neutron.agent.ovn.metadata.agent [-] Port cd560093-fc70-43b0-889b-598fc21465d4 in datapath 8429e89c-8540-4db3-b6b2-48775311a13d unbound from our chassis#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.101 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8429e89c-8540-4db3-b6b2-48775311a13d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.103 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2d33eb27-cf49-48a1-aaf3-ac53e640c79a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.103 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.104 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d namespace which is not needed anymore#033[00m
Nov 29 02:54:56 np0005539505 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b3.scope: Deactivated successfully.
Nov 29 02:54:56 np0005539505 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b3.scope: Consumed 13.268s CPU time.
Nov 29 02:54:56 np0005539505 systemd-machined[153285]: Machine qemu-86-instance-000000b3 terminated.
Nov 29 02:54:56 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [NOTICE]   (253336) : haproxy version is 2.8.14-c23fe91
Nov 29 02:54:56 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [NOTICE]   (253336) : path to executable is /usr/sbin/haproxy
Nov 29 02:54:56 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [WARNING]  (253336) : Exiting Master process...
Nov 29 02:54:56 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [ALERT]    (253336) : Current worker (253338) exited with code 143 (Terminated)
Nov 29 02:54:56 np0005539505 neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d[253332]: [WARNING]  (253336) : All workers exited. Exiting... (0)
Nov 29 02:54:56 np0005539505 systemd[1]: libpod-3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85.scope: Deactivated successfully.
Nov 29 02:54:56 np0005539505 podman[253513]: 2025-11-29 07:54:56.261156118 +0000 UTC m=+0.048834645 container died 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:54:56 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85-userdata-shm.mount: Deactivated successfully.
Nov 29 02:54:56 np0005539505 systemd[1]: var-lib-containers-storage-overlay-f3ff88f35bce6df5c3ec03ec34f45c107d6d5cd035b24837308afa0f8af3e86b-merged.mount: Deactivated successfully.
Nov 29 02:54:56 np0005539505 podman[253513]: 2025-11-29 07:54:56.318113973 +0000 UTC m=+0.105792490 container cleanup 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 29 02:54:56 np0005539505 systemd[1]: libpod-conmon-3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85.scope: Deactivated successfully.
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.332 186962 INFO nova.virt.libvirt.driver [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Instance destroyed successfully.#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.333 186962 DEBUG nova.objects.instance [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid f395beac-b14d-4701-bbfe-1190216043d1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.355 186962 DEBUG nova.virt.libvirt.vif [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:54:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-0-688194481',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=179,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDa1HqYHK6Fbyj4+WZlz/MxWl+SfeIBiBlR8m/oY3Vy4Q3n28dGa98Jt6Jmq1CjInsdStO6SA1dYTN5Q75hPAjlWGa44sox6aoYIWGLELZYzttGtittaInjJUuQncR/BLQ==',key_name='tempest-TestSecurityGroupsBasicOps-2142300595',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:54:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-j07er5w3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:54:35Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=f395beac-b14d-4701-bbfe-1190216043d1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.356 186962 DEBUG nova.network.os_vif_util [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "cd560093-fc70-43b0-889b-598fc21465d4", "address": "fa:16:3e:be:af:66", "network": {"id": "8429e89c-8540-4db3-b6b2-48775311a13d", "bridge": "br-int", "label": "tempest-network-smoke--2013284958", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd560093-fc", "ovs_interfaceid": "cd560093-fc70-43b0-889b-598fc21465d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.357 186962 DEBUG nova.network.os_vif_util [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.358 186962 DEBUG os_vif [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.360 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.361 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd560093-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.362 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.364 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.366 186962 INFO os_vif [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:af:66,bridge_name='br-int',has_traffic_filtering=True,id=cd560093-fc70-43b0-889b-598fc21465d4,network=Network(8429e89c-8540-4db3-b6b2-48775311a13d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd560093-fc')#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.367 186962 INFO nova.virt.libvirt.driver [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Deleting instance files /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1_del#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.368 186962 INFO nova.virt.libvirt.driver [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Deletion of /var/lib/nova/instances/f395beac-b14d-4701-bbfe-1190216043d1_del complete#033[00m
Nov 29 02:54:56 np0005539505 podman[253559]: 2025-11-29 07:54:56.391010659 +0000 UTC m=+0.050137982 container remove 3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.397 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[117a5ad3-aa92-4e34-8ea3-c0d7d65040cf]: (4, ('Sat Nov 29 07:54:56 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d (3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85)\n3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85\nSat Nov 29 07:54:56 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d (3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85)\n3a35c3c5a7857aff2d622ed561849ea0727d89cc1d36f4a2b9ab06e40a9ece85\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.399 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[564608ff-6c3d-4dc0-9d0b-cad3e0c0353f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.400 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8429e89c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.402 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 kernel: tap8429e89c-80: left promiscuous mode
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.403 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.406 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7f7d04-2931-4a5d-8a41-c999c831477b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.413 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.430 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[edd0736a-33cd-4766-8455-ae8b993ae511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.432 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9123eaf4-f7b7-474e-9f66-c4a5bb09e5e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.449 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[93db8ea0-e2e9-426e-b1f6-71bf6cc2e3c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 832202, 'reachable_time': 23760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253573, 'error': None, 'target': 'ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.452 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8429e89c-8540-4db3-b6b2-48775311a13d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 29 02:54:56 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:54:56.452 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[46da7913-a929-4554-a10b-00de845af62b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 29 02:54:56 np0005539505 systemd[1]: run-netns-ovnmeta\x2d8429e89c\x2d8540\x2d4db3\x2db6b2\x2d48775311a13d.mount: Deactivated successfully.
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.636 186962 DEBUG nova.compute.manager [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-unplugged-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.636 186962 DEBUG oslo_concurrency.lockutils [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.637 186962 DEBUG oslo_concurrency.lockutils [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.637 186962 DEBUG oslo_concurrency.lockutils [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.637 186962 DEBUG nova.compute.manager [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] No waiting events found dispatching network-vif-unplugged-cd560093-fc70-43b0-889b-598fc21465d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.638 186962 DEBUG nova.compute.manager [req-ee3274fa-6da7-49f5-893f-b28b030c1c7c req-6d50ae96-7c40-4525-9743-03ea2b0eb15b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-unplugged-cd560093-fc70-43b0-889b-598fc21465d4 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.655 186962 INFO nova.compute.manager [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Took 0.61 seconds to destroy the instance on the hypervisor.
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.656 186962 DEBUG oslo.service.loopingcall [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.656 186962 DEBUG nova.compute.manager [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 29 02:54:56 np0005539505 nova_compute[186958]: 2025-11-29 07:54:56.656 186962 DEBUG nova.network.neutron [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 29 02:54:57 np0005539505 podman[253575]: 2025-11-29 07:54:57.738994996 +0000 UTC m=+0.066745975 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:54:57 np0005539505 podman[253574]: 2025-11-29 07:54:57.747491597 +0000 UTC m=+0.068132964 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.556 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.810 186962 DEBUG nova.compute.manager [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.810 186962 DEBUG oslo_concurrency.lockutils [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "f395beac-b14d-4701-bbfe-1190216043d1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.811 186962 DEBUG oslo_concurrency.lockutils [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.811 186962 DEBUG oslo_concurrency.lockutils [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.812 186962 DEBUG nova.compute.manager [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] No waiting events found dispatching network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 02:54:58 np0005539505 nova_compute[186958]: 2025-11-29 07:54:58.812 186962 WARNING nova.compute.manager [req-226ad085-0e6b-40b8-8349-f8bce4758703 req-fa8c7571-7876-4b3c-8406-a7b098c3c876 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received unexpected event network-vif-plugged-cd560093-fc70-43b0-889b-598fc21465d4 for instance with vm_state active and task_state deleting.
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.047 186962 DEBUG nova.network.neutron [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.072 186962 INFO nova.compute.manager [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Took 4.42 seconds to deallocate network for instance.
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.145 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.146 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.170 186962 DEBUG nova.compute.manager [req-4eb51503-4e48-43ea-abf3-7329f7f5f749 req-7ee6f2b8-164d-4533-8022-83dbfd441dfb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Received event network-vif-deleted-cd560093-fc70-43b0-889b-598fc21465d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.264 186962 DEBUG nova.compute.provider_tree [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.290 186962 DEBUG nova.scheduler.client.report [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.323 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.358 186962 INFO nova.scheduler.client.report [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance f395beac-b14d-4701-bbfe-1190216043d1
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.364 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:01 np0005539505 nova_compute[186958]: 2025-11-29 07:55:01.451 186962 DEBUG oslo_concurrency.lockutils [None req-0b92612d-5615-4694-a900-cae357435acb dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "f395beac-b14d-4701-bbfe-1190216043d1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:55:03 np0005539505 nova_compute[186958]: 2025-11-29 07:55:03.599 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:06 np0005539505 nova_compute[186958]: 2025-11-29 07:55:06.368 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:08 np0005539505 nova_compute[186958]: 2025-11-29 07:55:08.600 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:11 np0005539505 nova_compute[186958]: 2025-11-29 07:55:11.329 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764402896.3274662, f395beac-b14d-4701-bbfe-1190216043d1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 02:55:11 np0005539505 nova_compute[186958]: 2025-11-29 07:55:11.330 186962 INFO nova.compute.manager [-] [instance: f395beac-b14d-4701-bbfe-1190216043d1] VM Stopped (Lifecycle Event)
Nov 29 02:55:11 np0005539505 nova_compute[186958]: 2025-11-29 07:55:11.386 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:11 np0005539505 nova_compute[186958]: 2025-11-29 07:55:11.713 186962 DEBUG nova.compute.manager [None req-4a951d22-b90f-4e5e-a130-9d35d93b048a - - - - - -] [instance: f395beac-b14d-4701-bbfe-1190216043d1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 02:55:11 np0005539505 podman[253615]: 2025-11-29 07:55:11.737470285 +0000 UTC m=+0.070750687 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Nov 29 02:55:11 np0005539505 podman[253616]: 2025-11-29 07:55:11.745008538 +0000 UTC m=+0.074739869 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:55:13 np0005539505 nova_compute[186958]: 2025-11-29 07:55:13.603 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:16 np0005539505 nova_compute[186958]: 2025-11-29 07:55:16.391 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:16 np0005539505 podman[253658]: 2025-11-29 07:55:16.749794546 +0000 UTC m=+0.073488654 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:55:18 np0005539505 nova_compute[186958]: 2025-11-29 07:55:18.606 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:21 np0005539505 nova_compute[186958]: 2025-11-29 07:55:21.435 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:23 np0005539505 nova_compute[186958]: 2025-11-29 07:55:23.607 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:23 np0005539505 podman[253678]: 2025-11-29 07:55:23.937418408 +0000 UTC m=+0.257260384 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:55:23 np0005539505 podman[253679]: 2025-11-29 07:55:23.951842017 +0000 UTC m=+0.265321703 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 02:55:26 np0005539505 nova_compute[186958]: 2025-11-29 07:55:26.474 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:27.542 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 02:55:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:27.542 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 02:55:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:27.543 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 02:55:28 np0005539505 nova_compute[186958]: 2025-11-29 07:55:28.609 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 02:55:28 np0005539505 podman[253731]: 2025-11-29 07:55:28.747043812 +0000 UTC m=+0.066671701 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Nov 29 02:55:28 np0005539505 podman[253730]: 2025-11-29 07:55:28.748298228 +0000 UTC m=+0.073158845 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:55:31 np0005539505 nova_compute[186958]: 2025-11-29 07:55:31.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:33 np0005539505 nova_compute[186958]: 2025-11-29 07:55:33.612 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:34 np0005539505 nova_compute[186958]: 2025-11-29 07:55:34.850 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 3.70 sec#033[00m
Nov 29 02:55:36 np0005539505 nova_compute[186958]: 2025-11-29 07:55:36.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:38 np0005539505 nova_compute[186958]: 2025-11-29 07:55:38.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:41 np0005539505 nova_compute[186958]: 2025-11-29 07:55:41.484 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:41 np0005539505 ovn_controller[95143]: 2025-11-29T07:55:41Z|00833|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Nov 29 02:55:42 np0005539505 podman[253771]: 2025-11-29 07:55:42.748579979 +0000 UTC m=+0.072947531 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:55:42 np0005539505 podman[253770]: 2025-11-29 07:55:42.752193171 +0000 UTC m=+0.081073691 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Nov 29 02:55:43 np0005539505 nova_compute[186958]: 2025-11-29 07:55:43.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:43 np0005539505 nova_compute[186958]: 2025-11-29 07:55:43.616 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:45.048 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:55:45 np0005539505 nova_compute[186958]: 2025-11-29 07:55:45.048 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:45 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:45.049 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:55:45 np0005539505 nova_compute[186958]: 2025-11-29 07:55:45.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:45 np0005539505 nova_compute[186958]: 2025-11-29 07:55:45.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:55:46 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:55:46.052 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:55:46 np0005539505 nova_compute[186958]: 2025-11-29 07:55:46.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:46 np0005539505 nova_compute[186958]: 2025-11-29 07:55:46.487 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:47 np0005539505 podman[253819]: 2025-11-29 07:55:47.746427421 +0000 UTC m=+0.084146008 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 29 02:55:48 np0005539505 nova_compute[186958]: 2025-11-29 07:55:48.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.630 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.630 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.630 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.630 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.805 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.807 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5707MB free_disk=73.0710563659668GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.808 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:55:50 np0005539505 nova_compute[186958]: 2025-11-29 07:55:50.808 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:55:51 np0005539505 nova_compute[186958]: 2025-11-29 07:55:51.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.067 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.067 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.205 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.229 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.265 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.265 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:53 np0005539505 nova_compute[186958]: 2025-11-29 07:55:53.896 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.266 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.267 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.267 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.492 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.492 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:54 np0005539505 nova_compute[186958]: 2025-11-29 07:55:54.493 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:54 np0005539505 podman[253841]: 2025-11-29 07:55:54.728751272 +0000 UTC m=+0.054859226 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:55:54 np0005539505 podman[253842]: 2025-11-29 07:55:54.75830006 +0000 UTC m=+0.080290328 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 29 02:55:56 np0005539505 nova_compute[186958]: 2025-11-29 07:55:56.496 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:58 np0005539505 nova_compute[186958]: 2025-11-29 07:55:58.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:58 np0005539505 nova_compute[186958]: 2025-11-29 07:55:58.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:55:59 np0005539505 nova_compute[186958]: 2025-11-29 07:55:59.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:55:59 np0005539505 podman[253891]: 2025-11-29 07:55:59.750202032 +0000 UTC m=+0.074810152 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 29 02:55:59 np0005539505 podman[253890]: 2025-11-29 07:55:59.760036471 +0000 UTC m=+0.077442757 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 02:56:01 np0005539505 nova_compute[186958]: 2025-11-29 07:56:01.509 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:03 np0005539505 nova_compute[186958]: 2025-11-29 07:56:03.624 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:06 np0005539505 nova_compute[186958]: 2025-11-29 07:56:06.519 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:08 np0005539505 nova_compute[186958]: 2025-11-29 07:56:08.625 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:11 np0005539505 nova_compute[186958]: 2025-11-29 07:56:11.523 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:13 np0005539505 nova_compute[186958]: 2025-11-29 07:56:13.626 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:13 np0005539505 podman[253930]: 2025-11-29 07:56:13.758714608 +0000 UTC m=+0.064600493 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:56:13 np0005539505 podman[253929]: 2025-11-29 07:56:13.76195219 +0000 UTC m=+0.075636536 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 29 02:56:16 np0005539505 nova_compute[186958]: 2025-11-29 07:56:16.526 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:18 np0005539505 nova_compute[186958]: 2025-11-29 07:56:18.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:18 np0005539505 podman[253972]: 2025-11-29 07:56:18.73369046 +0000 UTC m=+0.060569869 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:56:21 np0005539505 nova_compute[186958]: 2025-11-29 07:56:21.393 186962 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 6.54 sec#033[00m
Nov 29 02:56:21 np0005539505 nova_compute[186958]: 2025-11-29 07:56:21.529 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:23 np0005539505 nova_compute[186958]: 2025-11-29 07:56:23.664 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:24.572 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:56:24 np0005539505 nova_compute[186958]: 2025-11-29 07:56:24.572 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:24.573 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:56:25 np0005539505 podman[253990]: 2025-11-29 07:56:25.74316325 +0000 UTC m=+0.078983870 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 02:56:25 np0005539505 podman[253991]: 2025-11-29 07:56:25.788208097 +0000 UTC m=+0.106434058 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 29 02:56:26 np0005539505 nova_compute[186958]: 2025-11-29 07:56:26.530 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:27.543 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:27.543 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:27.544 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:28 np0005539505 nova_compute[186958]: 2025-11-29 07:56:28.665 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:30 np0005539505 podman[254043]: 2025-11-29 07:56:30.73353374 +0000 UTC m=+0.060065554 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:56:30 np0005539505 podman[254042]: 2025-11-29 07:56:30.769423687 +0000 UTC m=+0.089364804 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:56:31 np0005539505 nova_compute[186958]: 2025-11-29 07:56:31.533 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:31.575 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:33 np0005539505 nova_compute[186958]: 2025-11-29 07:56:33.667 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:36 np0005539505 nova_compute[186958]: 2025-11-29 07:56:36.537 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:38 np0005539505 nova_compute[186958]: 2025-11-29 07:56:38.669 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:39 np0005539505 nova_compute[186958]: 2025-11-29 07:56:39.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.611 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.612 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.633 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.791 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.792 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.801 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:56:41 np0005539505 nova_compute[186958]: 2025-11-29 07:56:41.802 186962 INFO nova.compute.claims [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.003 186962 DEBUG nova.compute.provider_tree [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.022 186962 DEBUG nova.scheduler.client.report [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.049 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.050 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.109 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.110 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.132 186962 INFO nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:56:42 np0005539505 nova_compute[186958]: 2025-11-29 07:56:42.163 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.162 186962 DEBUG nova.policy [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.318 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.319 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.319 186962 INFO nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Creating image(s)#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.320 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.320 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.321 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.333 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.388 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.389 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.389 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.402 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.468 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.470 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.501 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.502 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.502 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.559 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.560 186962 DEBUG nova.virt.disk.api [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.560 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.616 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.617 186962 DEBUG nova.virt.disk.api [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.617 186962 DEBUG nova.objects.instance [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid bcfc29cf-d647-4170-91a5-7f3158bf724b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.672 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.672 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Ensure instance console log exists: /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.673 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.673 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.673 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:43 np0005539505 nova_compute[186958]: 2025-11-29 07:56:43.674 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:44 np0005539505 nova_compute[186958]: 2025-11-29 07:56:44.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:44 np0005539505 podman[254096]: 2025-11-29 07:56:44.738428442 +0000 UTC m=+0.059448967 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:56:44 np0005539505 podman[254095]: 2025-11-29 07:56:44.7591853 +0000 UTC m=+0.089504998 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 29 02:56:46 np0005539505 nova_compute[186958]: 2025-11-29 07:56:46.543 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:47 np0005539505 nova_compute[186958]: 2025-11-29 07:56:47.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:47 np0005539505 nova_compute[186958]: 2025-11-29 07:56:47.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:56:48 np0005539505 nova_compute[186958]: 2025-11-29 07:56:48.102 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Successfully created port: 928a5256-1fd9-4eb7-b013-ea5dd33d2217 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:56:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 02:56:48 np0005539505 nova_compute[186958]: 2025-11-29 07:56:48.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:48 np0005539505 nova_compute[186958]: 2025-11-29 07:56:48.673 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.693 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Successfully updated port: 928a5256-1fd9-4eb7-b013-ea5dd33d2217 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:56:49 np0005539505 podman[254136]: 2025-11-29 07:56:49.706725185 +0000 UTC m=+0.044584945 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.724 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.724 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.724 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.789 186962 DEBUG nova.compute.manager [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.789 186962 DEBUG nova.compute.manager [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing instance network info cache due to event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:56:49 np0005539505 nova_compute[186958]: 2025-11-29 07:56:49.789 186962 DEBUG oslo_concurrency.lockutils [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.120 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.679 186962 DEBUG nova.network.neutron [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updating instance_info_cache with network_info: [{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.876 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.876 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Instance network_info: |[{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.877 186962 DEBUG oslo_concurrency.lockutils [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.877 186962 DEBUG nova.network.neutron [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.880 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Start _get_guest_xml network_info=[{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.885 186962 WARNING nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.890 186962 DEBUG nova.virt.libvirt.host [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.891 186962 DEBUG nova.virt.libvirt.host [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.893 186962 DEBUG nova.virt.libvirt.host [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.893 186962 DEBUG nova.virt.libvirt.host [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.895 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.895 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.895 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.895 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.896 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.897 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.897 186962 DEBUG nova.virt.hardware [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.900 186962 DEBUG nova.virt.libvirt.vif [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=180,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOsVrzJst/QCHGwjHhEZRBxj5YlBckGN7GCEqsMkTZK5w4IGj+g0cxLV5rL6uHR53+n/Eg5IhfSXRauG4UFN2fqp2q7dQZP9G9ko5uc17a24SXN5cp3NC9hm6kGBhd08GQ==',key_name='tempest-TestSecurityGroupsBasicOps-1707000329',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-n72gf740',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:56:42Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=bcfc29cf-d647-4170-91a5-7f3158bf724b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.901 186962 DEBUG nova.network.os_vif_util [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.901 186962 DEBUG nova.network.os_vif_util [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:56:50 np0005539505 nova_compute[186958]: 2025-11-29 07:56:50.902 186962 DEBUG nova.objects.instance [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid bcfc29cf-d647-4170-91a5-7f3158bf724b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.384 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.384 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.547 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.604 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.604 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.605 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <uuid>bcfc29cf-d647-4170-91a5-7f3158bf724b</uuid>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <name>instance-000000b4</name>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406</nova:name>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:56:50</nova:creationTime>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        <nova:port uuid="928a5256-1fd9-4eb7-b013-ea5dd33d2217">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="serial">bcfc29cf-d647-4170-91a5-7f3158bf724b</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="uuid">bcfc29cf-d647-4170-91a5-7f3158bf724b</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.config"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:e8:d6:a5"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <target dev="tap928a5256-1f"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/console.log" append="off"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:56:51 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:56:51 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:56:51 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:56:51 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.606 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Preparing to wait for external event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.606 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.607 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.607 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.608 186962 DEBUG nova.virt.libvirt.vif [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=180,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOsVrzJst/QCHGwjHhEZRBxj5YlBckGN7GCEqsMkTZK5w4IGj+g0cxLV5rL6uHR53+n/Eg5IhfSXRauG4UFN2fqp2q7dQZP9G9ko5uc17a24SXN5cp3NC9hm6kGBhd08GQ==',key_name='tempest-TestSecurityGroupsBasicOps-1707000329',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-n72gf740',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:56:42Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=bcfc29cf-d647-4170-91a5-7f3158bf724b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.608 186962 DEBUG nova.network.os_vif_util [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.609 186962 DEBUG nova.network.os_vif_util [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.609 186962 DEBUG os_vif [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.610 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.610 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.610 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.612 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.614 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.615 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap928a5256-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.616 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap928a5256-1f, col_values=(('external_ids', {'iface-id': '928a5256-1fd9-4eb7-b013-ea5dd33d2217', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:d6:a5', 'vm-uuid': 'bcfc29cf-d647-4170-91a5-7f3158bf724b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.617 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:51 np0005539505 NetworkManager[55134]: <info>  [1764403011.6184] manager: (tap928a5256-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/413)
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.618 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.622 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.623 186962 INFO os_vif [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f')#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.940 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.940 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.941 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:51 np0005539505 nova_compute[186958]: 2025-11-29 07:56:51.941 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.301 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.301 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.301 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:e8:d6:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.302 186962 INFO nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Using config drive#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.304 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.381 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.382 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.436 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.438 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-000000b4, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.config'#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.561 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.563 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5705MB free_disk=73.07085418701172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.563 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.563 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.824 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance bcfc29cf-d647-4170-91a5-7f3158bf724b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.824 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.824 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.838 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.851 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.852 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.863 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.881 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 02:56:52 np0005539505 nova_compute[186958]: 2025-11-29 07:56:52.935 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.050 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.090 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.090 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.246 186962 INFO nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Creating config drive at /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.config#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.251 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmbsr0ne4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.375 186962 DEBUG oslo_concurrency.processutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmbsr0ne4" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:56:53 np0005539505 kernel: tap928a5256-1f: entered promiscuous mode
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.4317] manager: (tap928a5256-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/414)
Nov 29 02:56:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:53Z|00834|binding|INFO|Claiming lport 928a5256-1fd9-4eb7-b013-ea5dd33d2217 for this chassis.
Nov 29 02:56:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:53Z|00835|binding|INFO|928a5256-1fd9-4eb7-b013-ea5dd33d2217: Claiming fa:16:3e:e8:d6:a5 10.100.0.8
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.432 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.436 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.439 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 systemd-udevd[254181]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:56:53 np0005539505 systemd-machined[153285]: New machine qemu-87-instance-000000b4.
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.4752] device (tap928a5256-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.4764] device (tap928a5256-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.492 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:53Z|00836|binding|INFO|Setting lport 928a5256-1fd9-4eb7-b013-ea5dd33d2217 ovn-installed in OVS
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.495 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 systemd[1]: Started Virtual Machine qemu-87-instance-000000b4.
Nov 29 02:56:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:53Z|00837|binding|INFO|Setting lport 928a5256-1fd9-4eb7-b013-ea5dd33d2217 up in Southbound
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.551 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:d6:a5 10.100.0.8'], port_security=['fa:16:3e:e8:d6:a5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'bcfc29cf-d647-4170-91a5-7f3158bf724b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23b5288f-bce8-4028-9624-9fccadc3798e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6277158c-6e74-483b-b9db-9a965d81b296 c1115576-d661-4a18-a4cf-72c2eb774188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b0e87af0-4809-48c1-af4c-b6cced0e0834, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=928a5256-1fd9-4eb7-b013-ea5dd33d2217) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.552 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 928a5256-1fd9-4eb7-b013-ea5dd33d2217 in datapath 23b5288f-bce8-4028-9624-9fccadc3798e bound to our chassis#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.554 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 23b5288f-bce8-4028-9624-9fccadc3798e#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.563 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fd598aae-f6fb-47ea-8028-4a6033ba6b9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.565 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap23b5288f-b1 in ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.568 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap23b5288f-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.568 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[091c39e2-ced7-4094-b4de-7af440e1cc01]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.569 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[30ef4e2c-03e7-4515-aacb-5d11dfaf2f31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.580 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[6174f96f-be8b-49ec-8480-ec3c826a5b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.593 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c001e4ee-0b51-4054-a3e2-4e6ef73ff242]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.621 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[29ddf49d-84b2-4a77-a9c6-688131e0ad78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.626 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6c96e78c-d373-4b5f-87f7-13fcb168861f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.6272] manager: (tap23b5288f-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/415)
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.656 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[35b41dd3-684e-44ad-8bc3-661ad07d553d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.659 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[bade8e30-af0f-404d-9fe5-8dbe962111c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.676 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.6909] device (tap23b5288f-b0): carrier: link connected
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.696 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c8c165-dd5b-44e8-bc74-c3ad644ad05e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.713 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[76a9f3ea-b2b8-47bd-b943-fdb4c0fc75b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23b5288f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:15:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846127, 'reachable_time': 21938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254214, 'error': None, 'target': 'ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.734 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[143fdae2-5035-4560-bead-16eb067ca58a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:15ec'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 846127, 'tstamp': 846127}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254217, 'error': None, 'target': 'ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.756 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d73ffae3-5f74-43c8-b8ed-f44c57f7b987]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap23b5288f-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:15:ec'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846127, 'reachable_time': 21938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254222, 'error': None, 'target': 'ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.795 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d04d9b1f-0913-4093-99dd-7a2bce9354ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.797 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403013.7968025, bcfc29cf-d647-4170-91a5-7f3158bf724b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.798 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] VM Started (Lifecycle Event)#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.856 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f5325279-1f46-44b9-8935-e4fdcb21df40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.857 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.857 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23b5288f-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.858 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.858 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23b5288f-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.860 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 NetworkManager[55134]: <info>  [1764403013.8607] manager: (tap23b5288f-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Nov 29 02:56:53 np0005539505 kernel: tap23b5288f-b0: entered promiscuous mode
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.864 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap23b5288f-b0, col_values=(('external_ids', {'iface-id': '8dc0ce4a-0802-4fe5-a3c8-10919705a6d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:53Z|00838|binding|INFO|Releasing lport 8dc0ce4a-0802-4fe5-a3c8-10919705a6d7 from this chassis (sb_readonly=0)
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.867 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/23b5288f-bce8-4028-9624-9fccadc3798e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/23b5288f-bce8-4028-9624-9fccadc3798e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.868 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[21f977af-c6d4-4d64-a78d-7933df6af585]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.868 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-23b5288f-bce8-4028-9624-9fccadc3798e
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/23b5288f-bce8-4028-9624-9fccadc3798e.pid.haproxy
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID 23b5288f-bce8-4028-9624-9fccadc3798e
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:56:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:56:53.869 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e', 'env', 'PROCESS_TAG=haproxy-23b5288f-bce8-4028-9624-9fccadc3798e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/23b5288f-bce8-4028-9624-9fccadc3798e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.879 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.883 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403013.7970428, bcfc29cf-d647-4170-91a5-7f3158bf724b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.884 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.912 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.915 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:56:53 np0005539505 nova_compute[186958]: 2025-11-29 07:56:53.939 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:56:54 np0005539505 podman[254255]: 2025-11-29 07:56:54.256304815 +0000 UTC m=+0.061730621 container create 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 29 02:56:54 np0005539505 systemd[1]: Started libpod-conmon-7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4.scope.
Nov 29 02:56:54 np0005539505 podman[254255]: 2025-11-29 07:56:54.224510434 +0000 UTC m=+0.029936270 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:56:54 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:56:54 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d9ff5e150b258ddd985f13b2a1aeb1aca168a828463bb82aeb6a51a199b0631/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:56:54 np0005539505 podman[254255]: 2025-11-29 07:56:54.34572722 +0000 UTC m=+0.151153046 container init 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.351 186962 DEBUG nova.compute.manager [req-89e5ced9-3378-4b36-84c6-ea3fefe0e40a req-b75b8113-d073-4238-9593-7259a9f60e7c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.352 186962 DEBUG oslo_concurrency.lockutils [req-89e5ced9-3378-4b36-84c6-ea3fefe0e40a req-b75b8113-d073-4238-9593-7259a9f60e7c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.352 186962 DEBUG oslo_concurrency.lockutils [req-89e5ced9-3378-4b36-84c6-ea3fefe0e40a req-b75b8113-d073-4238-9593-7259a9f60e7c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:54 np0005539505 podman[254255]: 2025-11-29 07:56:54.353047118 +0000 UTC m=+0.158472934 container start 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.352 186962 DEBUG oslo_concurrency.lockutils [req-89e5ced9-3378-4b36-84c6-ea3fefe0e40a req-b75b8113-d073-4238-9593-7259a9f60e7c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.353 186962 DEBUG nova.compute.manager [req-89e5ced9-3378-4b36-84c6-ea3fefe0e40a req-b75b8113-d073-4238-9593-7259a9f60e7c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Processing event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.354 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.356 186962 DEBUG nova.network.neutron [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updated VIF entry in instance network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.356 186962 DEBUG nova.network.neutron [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updating instance_info_cache with network_info: [{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.358 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403014.3588114, bcfc29cf-d647-4170-91a5-7f3158bf724b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.359 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.360 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.364 186962 INFO nova.virt.libvirt.driver [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Instance spawned successfully.#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.364 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:56:54 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [NOTICE]   (254275) : New worker (254277) forked
Nov 29 02:56:54 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [NOTICE]   (254275) : Loading success.
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.449 186962 DEBUG oslo_concurrency.lockutils [req-030cd2c5-c3f9-402e-871b-d8babf364e8d req-a9f35793-3ebf-4788-9725-f177fa71f187 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.454 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.458 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.458 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.459 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.460 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.460 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.461 186962 DEBUG nova.virt.libvirt.driver [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.466 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.518 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.575 186962 INFO nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Took 11.26 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.576 186962 DEBUG nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.658 186962 INFO nova.compute.manager [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Took 12.94 seconds to build instance.#033[00m
Nov 29 02:56:54 np0005539505 nova_compute[186958]: 2025-11-29 07:56:54.676 186962 DEBUG oslo_concurrency.lockutils [None req-758e5eae-bd86-445d-b93f-0d893d2f3296 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.434 186962 DEBUG nova.compute.manager [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.434 186962 DEBUG oslo_concurrency.lockutils [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.435 186962 DEBUG oslo_concurrency.lockutils [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.435 186962 DEBUG oslo_concurrency.lockutils [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.435 186962 DEBUG nova.compute.manager [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] No waiting events found dispatching network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.436 186962 WARNING nova.compute.manager [req-e3a3de96-1863-4345-9045-e90d599316c5 req-3bb744cb-bcf3-491d-9ffd-395aa4c95ad0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received unexpected event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:56:56 np0005539505 nova_compute[186958]: 2025-11-29 07:56:56.619 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:56 np0005539505 podman[254286]: 2025-11-29 07:56:56.728791391 +0000 UTC m=+0.059096356 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:56:56 np0005539505 podman[254287]: 2025-11-29 07:56:56.767120558 +0000 UTC m=+0.097270179 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 29 02:56:58 np0005539505 NetworkManager[55134]: <info>  [1764403018.5178] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/417)
Nov 29 02:56:58 np0005539505 NetworkManager[55134]: <info>  [1764403018.5184] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/418)
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.522 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:56:58Z|00839|binding|INFO|Releasing lport 8dc0ce4a-0802-4fe5-a3c8-10919705a6d7 from this chassis (sb_readonly=0)
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.641 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.677 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.777 186962 DEBUG nova.compute.manager [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.777 186962 DEBUG nova.compute.manager [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing instance network info cache due to event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.778 186962 DEBUG oslo_concurrency.lockutils [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.778 186962 DEBUG oslo_concurrency.lockutils [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:56:58 np0005539505 nova_compute[186958]: 2025-11-29 07:56:58.778 186962 DEBUG nova.network.neutron [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:56:59 np0005539505 nova_compute[186958]: 2025-11-29 07:56:59.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:56:59 np0005539505 nova_compute[186958]: 2025-11-29 07:56:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:00 np0005539505 nova_compute[186958]: 2025-11-29 07:57:00.181 186962 DEBUG nova.network.neutron [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updated VIF entry in instance network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:57:00 np0005539505 nova_compute[186958]: 2025-11-29 07:57:00.182 186962 DEBUG nova.network.neutron [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updating instance_info_cache with network_info: [{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:57:00 np0005539505 nova_compute[186958]: 2025-11-29 07:57:00.218 186962 DEBUG oslo_concurrency.lockutils [req-bf8462f2-5be2-42f9-84cb-101a4155528e req-39eae3df-c937-4368-8e21-15030115e777 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:57:01 np0005539505 nova_compute[186958]: 2025-11-29 07:57:01.623 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:01 np0005539505 podman[254338]: 2025-11-29 07:57:01.72306368 +0000 UTC m=+0.059979922 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125)
Nov 29 02:57:01 np0005539505 podman[254339]: 2025-11-29 07:57:01.728091232 +0000 UTC m=+0.062001528 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 02:57:03 np0005539505 nova_compute[186958]: 2025-11-29 07:57:03.721 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:06 np0005539505 nova_compute[186958]: 2025-11-29 07:57:06.628 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:57:07Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:d6:a5 10.100.0.8
Nov 29 02:57:07 np0005539505 ovn_controller[95143]: 2025-11-29T07:57:07Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:d6:a5 10.100.0.8
Nov 29 02:57:08 np0005539505 nova_compute[186958]: 2025-11-29 07:57:08.724 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:11 np0005539505 nova_compute[186958]: 2025-11-29 07:57:11.632 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:13 np0005539505 nova_compute[186958]: 2025-11-29 07:57:13.726 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:15 np0005539505 podman[254388]: 2025-11-29 07:57:15.726116781 +0000 UTC m=+0.060478116 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 02:57:15 np0005539505 podman[254389]: 2025-11-29 07:57:15.743141173 +0000 UTC m=+0.075863062 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 02:57:16 np0005539505 nova_compute[186958]: 2025-11-29 07:57:16.636 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:18 np0005539505 nova_compute[186958]: 2025-11-29 07:57:18.728 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:20 np0005539505 podman[254430]: 2025-11-29 07:57:20.738072131 +0000 UTC m=+0.072011863 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 29 02:57:21 np0005539505 nova_compute[186958]: 2025-11-29 07:57:21.639 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.494 186962 DEBUG nova.compute.manager [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.495 186962 DEBUG nova.compute.manager [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing instance network info cache due to event network-changed-928a5256-1fd9-4eb7-b013-ea5dd33d2217. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.495 186962 DEBUG oslo_concurrency.lockutils [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.496 186962 DEBUG oslo_concurrency.lockutils [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.496 186962 DEBUG nova.network.neutron [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Refreshing network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.575 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.576 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.577 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.577 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.577 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.594 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.594 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.595 186962 INFO nova.compute.manager [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Terminating instance#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.595 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.596 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.606 186962 DEBUG nova.compute.manager [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:57:22 np0005539505 kernel: tap928a5256-1f (unregistering): left promiscuous mode
Nov 29 02:57:22 np0005539505 NetworkManager[55134]: <info>  [1764403042.6373] device (tap928a5256-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:57:22Z|00840|binding|INFO|Releasing lport 928a5256-1fd9-4eb7-b013-ea5dd33d2217 from this chassis (sb_readonly=0)
Nov 29 02:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:57:22Z|00841|binding|INFO|Setting lport 928a5256-1fd9-4eb7-b013-ea5dd33d2217 down in Southbound
Nov 29 02:57:22 np0005539505 ovn_controller[95143]: 2025-11-29T07:57:22Z|00842|binding|INFO|Removing iface tap928a5256-1f ovn-installed in OVS
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.645 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.647 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.653 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:d6:a5 10.100.0.8'], port_security=['fa:16:3e:e8:d6:a5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'bcfc29cf-d647-4170-91a5-7f3158bf724b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-23b5288f-bce8-4028-9624-9fccadc3798e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6277158c-6e74-483b-b9db-9a965d81b296 c1115576-d661-4a18-a4cf-72c2eb774188', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b0e87af0-4809-48c1-af4c-b6cced0e0834, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=928a5256-1fd9-4eb7-b013-ea5dd33d2217) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.655 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 928a5256-1fd9-4eb7-b013-ea5dd33d2217 in datapath 23b5288f-bce8-4028-9624-9fccadc3798e unbound from our chassis#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.657 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 23b5288f-bce8-4028-9624-9fccadc3798e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.658 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[457cb4cf-2e74-4582-b590-f9817a29997a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.659 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e namespace which is not needed anymore#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.664 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b4.scope: Deactivated successfully.
Nov 29 02:57:22 np0005539505 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b4.scope: Consumed 13.466s CPU time.
Nov 29 02:57:22 np0005539505 systemd-machined[153285]: Machine qemu-87-instance-000000b4 terminated.
Nov 29 02:57:22 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [NOTICE]   (254275) : haproxy version is 2.8.14-c23fe91
Nov 29 02:57:22 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [NOTICE]   (254275) : path to executable is /usr/sbin/haproxy
Nov 29 02:57:22 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [WARNING]  (254275) : Exiting Master process...
Nov 29 02:57:22 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [ALERT]    (254275) : Current worker (254277) exited with code 143 (Terminated)
Nov 29 02:57:22 np0005539505 neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e[254271]: [WARNING]  (254275) : All workers exited. Exiting... (0)
Nov 29 02:57:22 np0005539505 systemd[1]: libpod-7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4.scope: Deactivated successfully.
Nov 29 02:57:22 np0005539505 podman[254476]: 2025-11-29 07:57:22.780506894 +0000 UTC m=+0.041906499 container died 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:57:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4-userdata-shm.mount: Deactivated successfully.
Nov 29 02:57:22 np0005539505 systemd[1]: var-lib-containers-storage-overlay-5d9ff5e150b258ddd985f13b2a1aeb1aca168a828463bb82aeb6a51a199b0631-merged.mount: Deactivated successfully.
Nov 29 02:57:22 np0005539505 podman[254476]: 2025-11-29 07:57:22.817322568 +0000 UTC m=+0.078722173 container cleanup 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.824 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 systemd[1]: libpod-conmon-7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4.scope: Deactivated successfully.
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.833 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.861 186962 INFO nova.virt.libvirt.driver [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Instance destroyed successfully.#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.861 186962 DEBUG nova.objects.instance [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid bcfc29cf-d647-4170-91a5-7f3158bf724b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:57:22 np0005539505 podman[254509]: 2025-11-29 07:57:22.883770672 +0000 UTC m=+0.041026855 container remove 7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.883 186962 DEBUG nova.virt.libvirt.vif [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:56:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-443543406',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=180,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOsVrzJst/QCHGwjHhEZRBxj5YlBckGN7GCEqsMkTZK5w4IGj+g0cxLV5rL6uHR53+n/Eg5IhfSXRauG4UFN2fqp2q7dQZP9G9ko5uc17a24SXN5cp3NC9hm6kGBhd08GQ==',key_name='tempest-TestSecurityGroupsBasicOps-1707000329',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:56:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-n72gf740',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:56:54Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=bcfc29cf-d647-4170-91a5-7f3158bf724b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.884 186962 DEBUG nova.network.os_vif_util [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.885 186962 DEBUG nova.network.os_vif_util [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.885 186962 DEBUG os_vif [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.887 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.887 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap928a5256-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.888 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[340dfd8f-a78c-4bc3-b6b6-a3e40beafa11]: (4, ('Sat Nov 29 07:57:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e (7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4)\n7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4\nSat Nov 29 07:57:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e (7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4)\n7e91cfc22d94965594ea8e55439e58dab376d9e616f70ab5813b2a182f6885f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.890 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.892 186962 INFO os_vif [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:d6:a5,bridge_name='br-int',has_traffic_filtering=True,id=928a5256-1fd9-4eb7-b013-ea5dd33d2217,network=Network(23b5288f-bce8-4028-9624-9fccadc3798e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap928a5256-1f')#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.892 186962 INFO nova.virt.libvirt.driver [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Deleting instance files /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b_del#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.893 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8f51d54f-354b-42e4-aa5f-f1739f332d2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.893 186962 INFO nova.virt.libvirt.driver [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Deletion of /var/lib/nova/instances/bcfc29cf-d647-4170-91a5-7f3158bf724b_del complete#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.894 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23b5288f-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 kernel: tap23b5288f-b0: left promiscuous mode
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.909 186962 DEBUG nova.compute.manager [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-unplugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.909 186962 DEBUG oslo_concurrency.lockutils [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.909 186962 DEBUG oslo_concurrency.lockutils [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.909 186962 DEBUG oslo_concurrency.lockutils [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.910 186962 DEBUG nova.compute.manager [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] No waiting events found dispatching network-vif-unplugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.910 186962 DEBUG nova.compute.manager [req-6f56c4ed-c10b-4cfb-9f3c-9c7c64ae4b42 req-1781f736-3b1e-4350-a4c8-e47b19943f88 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-unplugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.910 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[7dee37e5-5f8f-42ce-960d-3b11ab982bcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.933 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[834e7b27-7603-48a5-a1dc-f3ef0b523eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.935 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6b0b145d-42f0-4285-a57b-107e148a1a22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3f83f92a-9c1d-482e-b129-1589d3e91d91]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 846120, 'reachable_time': 39202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254537, 'error': None, 'target': 'ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 systemd[1]: run-netns-ovnmeta\x2d23b5288f\x2dbce8\x2d4028\x2d9624\x2d9fccadc3798e.mount: Deactivated successfully.
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.957 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-23b5288f-bce8-4028-9624-9fccadc3798e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:57:22 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:22.957 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[c07bc0c2-b63e-437d-ad6c-652dbb034897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.960 186962 INFO nova.compute.manager [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.960 186962 DEBUG oslo.service.loopingcall [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.960 186962 DEBUG nova.compute.manager [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:57:22 np0005539505 nova_compute[186958]: 2025-11-29 07:57:22.961 186962 DEBUG nova.network.neutron [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:57:23 np0005539505 nova_compute[186958]: 2025-11-29 07:57:23.729 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.766 186962 DEBUG nova.network.neutron [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updated VIF entry in instance network info cache for port 928a5256-1fd9-4eb7-b013-ea5dd33d2217. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.766 186962 DEBUG nova.network.neutron [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updating instance_info_cache with network_info: [{"id": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "address": "fa:16:3e:e8:d6:a5", "network": {"id": "23b5288f-bce8-4028-9624-9fccadc3798e", "bridge": "br-int", "label": "tempest-network-smoke--30615314", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap928a5256-1f", "ovs_interfaceid": "928a5256-1fd9-4eb7-b013-ea5dd33d2217", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.785 186962 DEBUG nova.network.neutron [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.786 186962 DEBUG oslo_concurrency.lockutils [req-a8ada1e9-a4a0-4c22-9215-2b6f1b2c250f req-91dc192c-28f5-4639-b95f-20bfa982f7ac 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-bcfc29cf-d647-4170-91a5-7f3158bf724b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.798 186962 INFO nova.compute.manager [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Took 1.84 seconds to deallocate network for instance.#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.904 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.904 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:24 np0005539505 nova_compute[186958]: 2025-11-29 07:57:24.907 186962 DEBUG nova.compute.manager [req-88753c88-4699-4b1c-9328-53923b886209 req-9580d999-8412-4f8b-a290-78dbb90f8a40 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-deleted-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:57:25 np0005539505 nova_compute[186958]: 2025-11-29 07:57:25.066 186962 DEBUG nova.compute.provider_tree [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:57:25 np0005539505 nova_compute[186958]: 2025-11-29 07:57:25.248 186962 DEBUG nova.scheduler.client.report [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.485 186962 DEBUG nova.compute.manager [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.486 186962 DEBUG oslo_concurrency.lockutils [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.486 186962 DEBUG oslo_concurrency.lockutils [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.486 186962 DEBUG oslo_concurrency.lockutils [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.486 186962 DEBUG nova.compute.manager [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] No waiting events found dispatching network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.487 186962 WARNING nova.compute.manager [req-43c658d1-da46-4d7f-a3d6-74c19c7efa5a req-dfc2d3bf-5264-41be-850f-62d742b56018 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Received unexpected event network-vif-plugged-928a5256-1fd9-4eb7-b013-ea5dd33d2217 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.524 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.757 186962 INFO nova.scheduler.client.report [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance bcfc29cf-d647-4170-91a5-7f3158bf724b#033[00m
Nov 29 02:57:26 np0005539505 nova_compute[186958]: 2025-11-29 07:57:26.878 186962 DEBUG oslo_concurrency.lockutils [None req-9ef1d637-61a8-41b0-9dc4-371781122062 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "bcfc29cf-d647-4170-91a5-7f3158bf724b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.302s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:27.545 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:27.545 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:57:27.545 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:27 np0005539505 podman[254538]: 2025-11-29 07:57:27.71540395 +0000 UTC m=+0.046985313 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 02:57:27 np0005539505 podman[254539]: 2025-11-29 07:57:27.757987548 +0000 UTC m=+0.086201255 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 29 02:57:27 np0005539505 nova_compute[186958]: 2025-11-29 07:57:27.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:28 np0005539505 nova_compute[186958]: 2025-11-29 07:57:28.784 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:32 np0005539505 podman[254586]: 2025-11-29 07:57:32.714456845 +0000 UTC m=+0.052102748 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 02:57:32 np0005539505 podman[254587]: 2025-11-29 07:57:32.744043174 +0000 UTC m=+0.077859119 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 02:57:32 np0005539505 nova_compute[186958]: 2025-11-29 07:57:32.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:33 np0005539505 nova_compute[186958]: 2025-11-29 07:57:33.785 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.189 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.330 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.859 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403042.8575015, bcfc29cf-d647-4170-91a5-7f3158bf724b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.859 186962 INFO nova.compute.manager [-] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.879 186962 DEBUG nova.compute.manager [None req-6ccc51e8-26e6-4904-ba01-bfdae6b66766 - - - - - -] [instance: bcfc29cf-d647-4170-91a5-7f3158bf724b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:57:37 np0005539505 nova_compute[186958]: 2025-11-29 07:57:37.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:38 np0005539505 nova_compute[186958]: 2025-11-29 07:57:38.786 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:42 np0005539505 nova_compute[186958]: 2025-11-29 07:57:42.894 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:43 np0005539505 nova_compute[186958]: 2025-11-29 07:57:43.788 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:46 np0005539505 nova_compute[186958]: 2025-11-29 07:57:46.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:46 np0005539505 podman[254629]: 2025-11-29 07:57:46.728497089 +0000 UTC m=+0.050047356 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 02:57:46 np0005539505 podman[254628]: 2025-11-29 07:57:46.733059748 +0000 UTC m=+0.055446219 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vendor=Red Hat, Inc.)
Nov 29 02:57:47 np0005539505 nova_compute[186958]: 2025-11-29 07:57:47.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:47 np0005539505 nova_compute[186958]: 2025-11-29 07:57:47.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:57:47 np0005539505 nova_compute[186958]: 2025-11-29 07:57:47.896 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:48 np0005539505 nova_compute[186958]: 2025-11-29 07:57:48.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:50 np0005539505 nova_compute[186958]: 2025-11-29 07:57:50.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.392 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.393 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.421 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.421 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.422 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.422 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:57:51 np0005539505 podman[254671]: 2025-11-29 07:57:51.511664699 +0000 UTC m=+0.054722248 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.587 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.588 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5683MB free_disk=73.07092666625977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.588 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.588 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.656 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.657 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.695 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.711 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.732 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:57:51 np0005539505 nova_compute[186958]: 2025-11-29 07:57:51.732 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:57:52 np0005539505 nova_compute[186958]: 2025-11-29 07:57:52.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:53 np0005539505 nova_compute[186958]: 2025-11-29 07:57:53.718 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:53 np0005539505 nova_compute[186958]: 2025-11-29 07:57:53.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:56 np0005539505 nova_compute[186958]: 2025-11-29 07:57:56.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:57:57 np0005539505 nova_compute[186958]: 2025-11-29 07:57:57.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:57:58 np0005539505 podman[254692]: 2025-11-29 07:57:58.713950386 +0000 UTC m=+0.051143207 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 02:57:58 np0005539505 podman[254693]: 2025-11-29 07:57:58.741251048 +0000 UTC m=+0.076643758 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller)
Nov 29 02:57:58 np0005539505 nova_compute[186958]: 2025-11-29 07:57:58.794 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:01 np0005539505 nova_compute[186958]: 2025-11-29 07:58:01.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:01 np0005539505 nova_compute[186958]: 2025-11-29 07:58:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:02 np0005539505 nova_compute[186958]: 2025-11-29 07:58:02.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:03 np0005539505 podman[254743]: 2025-11-29 07:58:03.727114191 +0000 UTC m=+0.059341288 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 02:58:03 np0005539505 podman[254744]: 2025-11-29 07:58:03.761689199 +0000 UTC m=+0.091502308 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 02:58:03 np0005539505 nova_compute[186958]: 2025-11-29 07:58:03.795 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:07 np0005539505 nova_compute[186958]: 2025-11-29 07:58:07.905 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:08 np0005539505 nova_compute[186958]: 2025-11-29 07:58:08.798 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:12 np0005539505 nova_compute[186958]: 2025-11-29 07:58:12.906 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:13 np0005539505 nova_compute[186958]: 2025-11-29 07:58:13.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:15 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:15Z|00843|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Nov 29 02:58:17 np0005539505 podman[254785]: 2025-11-29 07:58:17.737762963 +0000 UTC m=+0.060226544 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:58:17 np0005539505 podman[254784]: 2025-11-29 07:58:17.769095499 +0000 UTC m=+0.097114497 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Nov 29 02:58:17 np0005539505 nova_compute[186958]: 2025-11-29 07:58:17.908 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:18 np0005539505 nova_compute[186958]: 2025-11-29 07:58:18.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:19 np0005539505 nova_compute[186958]: 2025-11-29 07:58:19.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:20 np0005539505 nova_compute[186958]: 2025-11-29 07:58:20.393 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:20 np0005539505 nova_compute[186958]: 2025-11-29 07:58:20.394 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 29 02:58:20 np0005539505 nova_compute[186958]: 2025-11-29 07:58:20.409 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 29 02:58:21 np0005539505 podman[254827]: 2025-11-29 07:58:21.748322399 +0000 UTC m=+0.079192511 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:58:22 np0005539505 nova_compute[186958]: 2025-11-29 07:58:22.911 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:23 np0005539505 nova_compute[186958]: 2025-11-29 07:58:23.804 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:26 np0005539505 nova_compute[186958]: 2025-11-29 07:58:26.948 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:26 np0005539505 nova_compute[186958]: 2025-11-29 07:58:26.948 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:26 np0005539505 nova_compute[186958]: 2025-11-29 07:58:26.972 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.079 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.080 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.090 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.091 186962 INFO nova.compute.claims [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.197 186962 DEBUG nova.compute.provider_tree [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:58:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:27.546 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:27.547 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:27.547 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.628 186962 DEBUG nova.scheduler.client.report [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.657 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.577s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.658 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.727 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.727 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.746 186962 INFO nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.762 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:58:27 np0005539505 nova_compute[186958]: 2025-11-29 07:58:27.924 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.108 186962 DEBUG nova.policy [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.127 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.129 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.130 186962 INFO nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Creating image(s)#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.130 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.131 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.131 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.147 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.209 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.210 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.211 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.223 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.291 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.293 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.327 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.328 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.328 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.395 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.397 186962 DEBUG nova.virt.disk.api [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.397 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.452 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.453 186962 DEBUG nova.virt.disk.api [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.453 186962 DEBUG nova.objects.instance [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0a864665-adb3-44b7-8550-4dcd7f7e8251 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.468 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.468 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Ensure instance console log exists: /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.469 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.469 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.470 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:28.781 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.780 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:28 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:28.782 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:58:28 np0005539505 nova_compute[186958]: 2025-11-29 07:58:28.807 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:29 np0005539505 nova_compute[186958]: 2025-11-29 07:58:29.209 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Successfully created port: 127f3745-5a5a-40fe-a9dd-c135fbf7d109 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:58:29 np0005539505 podman[254861]: 2025-11-29 07:58:29.725464075 +0000 UTC m=+0.060497402 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:58:29 np0005539505 podman[254862]: 2025-11-29 07:58:29.771095315 +0000 UTC m=+0.101149461 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.348 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Successfully updated port: 127f3745-5a5a-40fe-a9dd-c135fbf7d109 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.373 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.374 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.374 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.504 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.623 186962 DEBUG nova.compute.manager [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.624 186962 DEBUG nova.compute.manager [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing instance network info cache due to event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:58:30 np0005539505 nova_compute[186958]: 2025-11-29 07:58:30.624 186962 DEBUG oslo_concurrency.lockutils [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.175 186962 DEBUG nova.network.neutron [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updating instance_info_cache with network_info: [{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.199 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.199 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Instance network_info: |[{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.199 186962 DEBUG oslo_concurrency.lockutils [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.200 186962 DEBUG nova.network.neutron [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.202 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Start _get_guest_xml network_info=[{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.207 186962 WARNING nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.215 186962 DEBUG nova.virt.libvirt.host [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.216 186962 DEBUG nova.virt.libvirt.host [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.219 186962 DEBUG nova.virt.libvirt.host [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.220 186962 DEBUG nova.virt.libvirt.host [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.222 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.222 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.223 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.223 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.223 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.224 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.224 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.224 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.224 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.225 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.225 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.225 186962 DEBUG nova.virt.hardware [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.228 186962 DEBUG nova.virt.libvirt.vif [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=182,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-aeiperp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:58:27Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=0a864665-adb3-44b7-8550-4dcd7f7e8251,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.229 186962 DEBUG nova.network.os_vif_util [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.230 186962 DEBUG nova.network.os_vif_util [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.230 186962 DEBUG nova.objects.instance [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0a864665-adb3-44b7-8550-4dcd7f7e8251 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.245 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <uuid>0a864665-adb3-44b7-8550-4dcd7f7e8251</uuid>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <name>instance-000000b6</name>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403</nova:name>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:58:31</nova:creationTime>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        <nova:port uuid="127f3745-5a5a-40fe-a9dd-c135fbf7d109">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="serial">0a864665-adb3-44b7-8550-4dcd7f7e8251</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="uuid">0a864665-adb3-44b7-8550-4dcd7f7e8251</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.config"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:65:8f:87"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <target dev="tap127f3745-5a"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/console.log" append="off"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:58:31 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:58:31 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:58:31 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:58:31 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.246 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Preparing to wait for external event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.247 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.247 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.247 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.248 186962 DEBUG nova.virt.libvirt.vif [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=182,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-aeiperp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:58:27Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=0a864665-adb3-44b7-8550-4dcd7f7e8251,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.248 186962 DEBUG nova.network.os_vif_util [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.249 186962 DEBUG nova.network.os_vif_util [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.249 186962 DEBUG os_vif [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.250 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.250 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.250 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.253 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.253 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap127f3745-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.254 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap127f3745-5a, col_values=(('external_ids', {'iface-id': '127f3745-5a5a-40fe-a9dd-c135fbf7d109', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:8f:87', 'vm-uuid': '0a864665-adb3-44b7-8550-4dcd7f7e8251'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.2560] manager: (tap127f3745-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.257 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.264 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.266 186962 INFO os_vif [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a')#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.318 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.319 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.319 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:65:8f:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.320 186962 INFO nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Using config drive#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.662 186962 INFO nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Creating config drive at /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.config#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.667 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3k24_84t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.791 186962 DEBUG oslo_concurrency.processutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3k24_84t" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:58:31 np0005539505 kernel: tap127f3745-5a: entered promiscuous mode
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.8500] manager: (tap127f3745-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/420)
Nov 29 02:58:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:31Z|00844|binding|INFO|Claiming lport 127f3745-5a5a-40fe-a9dd-c135fbf7d109 for this chassis.
Nov 29 02:58:31 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:31Z|00845|binding|INFO|127f3745-5a5a-40fe-a9dd-c135fbf7d109: Claiming fa:16:3e:65:8f:87 10.100.0.6
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.856 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.860 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.865 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 nova_compute[186958]: 2025-11-29 07:58:31.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.8685] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/421)
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.8693] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/422)
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.878 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:8f:87 10.100.0.6'], port_security=['fa:16:3e:65:8f:87 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '155897d4-49e9-4196-9d87-858cab256c02', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3099386-1dbe-4af7-95d0-de6761c24471, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=127f3745-5a5a-40fe-a9dd-c135fbf7d109) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.880 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 127f3745-5a5a-40fe-a9dd-c135fbf7d109 in datapath e25d5113-a42d-44ca-8e65-a777d9e11f48 bound to our chassis#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.881 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e25d5113-a42d-44ca-8e65-a777d9e11f48#033[00m
Nov 29 02:58:31 np0005539505 systemd-udevd[254929]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:58:31 np0005539505 systemd-machined[153285]: New machine qemu-88-instance-000000b6.
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.897 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b2127cd5-d5a7-46f0-b3bd-ece250681fa3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.898 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape25d5113-a1 in ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.901 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape25d5113-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.901 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8aa70f-8554-40f0-89e4-4d5ab60e6ac1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:31 np0005539505 systemd[1]: Started Virtual Machine qemu-88-instance-000000b6.
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.9036] device (tap127f3745-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.902 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[6ede0058-7593-4979-852c-4bda74c23d21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:31 np0005539505 NetworkManager[55134]: <info>  [1764403111.9044] device (tap127f3745-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.913 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5dde9b-d299-49c1-b9cc-e8ac662338d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.952 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9018d120-8ef0-4915-8ec0-7e5e18fa20ea]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:31 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:31.980 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[dc7f3269-eb49-4945-9159-7dfb7e958b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 NetworkManager[55134]: <info>  [1764403112.0001] manager: (tape25d5113-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/423)
Nov 29 02:58:32 np0005539505 systemd-udevd[254932]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.005 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a40205-5c87-4eba-9b2b-443bb2a91124]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.018 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.040 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.044 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[0de90567-58d5-4045-840c-34b48334548f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.047 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[8d40c2fc-19d8-4fb1-878c-be36add09fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:32Z|00846|binding|INFO|Setting lport 127f3745-5a5a-40fe-a9dd-c135fbf7d109 ovn-installed in OVS
Nov 29 02:58:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:32Z|00847|binding|INFO|Setting lport 127f3745-5a5a-40fe-a9dd-c135fbf7d109 up in Southbound
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.051 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 NetworkManager[55134]: <info>  [1764403112.0703] device (tape25d5113-a0): carrier: link connected
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.076 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[33ab4275-cc66-42c7-9624-72a75a61ed23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.095 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[319c5120-3043-46e2-b7d8-6ab525ee4b83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape25d5113-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:60:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 855965, 'reachable_time': 35162, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254962, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.115 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[05bd6667-0497-4a00-925b-5f757a5f2c98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9f:600a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 855965, 'tstamp': 855965}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254963, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.133 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[99ad21aa-ca7e-47f1-b17d-973493ad17e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape25d5113-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9f:60:0a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 262], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 855965, 'reachable_time': 35162, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254965, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.171 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[0f725c08-1fbb-401b-9c0c-52e1e80c424d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.213 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403112.2116263, 0a864665-adb3-44b7-8550-4dcd7f7e8251 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.214 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] VM Started (Lifecycle Event)#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.235 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.241 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403112.2119858, 0a864665-adb3-44b7-8550-4dcd7f7e8251 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.241 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.246 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[3945526a-c22b-40d2-aa35-1e5c5e3c732e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.247 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape25d5113-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.247 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.248 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape25d5113-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.249 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 kernel: tape25d5113-a0: entered promiscuous mode
Nov 29 02:58:32 np0005539505 NetworkManager[55134]: <info>  [1764403112.2505] manager: (tape25d5113-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/424)
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.251 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.253 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape25d5113-a0, col_values=(('external_ids', {'iface-id': '75a0c6b7-2dfb-46f1-937c-112eb9b0d504'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.254 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:32Z|00848|binding|INFO|Releasing lport 75a0c6b7-2dfb-46f1-937c-112eb9b0d504 from this chassis (sb_readonly=0)
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.255 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.258 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.259 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[052493ad-fd12-43f4-bb66-9dc60ae18036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.260 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-e25d5113-a42d-44ca-8e65-a777d9e11f48
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/e25d5113-a42d-44ca-8e65-a777d9e11f48.pid.haproxy
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID e25d5113-a42d-44ca-8e65-a777d9e11f48
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:58:32 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:32.261 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'env', 'PROCESS_TAG=haproxy-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e25d5113-a42d-44ca-8e65-a777d9e11f48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.262 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.266 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.268 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.298 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.372 186962 DEBUG nova.compute.manager [req-accc1670-a6c0-476b-9987-e9c20b494e4a req-fa83625a-b0e8-4803-aaea-a9150a909dd4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.373 186962 DEBUG oslo_concurrency.lockutils [req-accc1670-a6c0-476b-9987-e9c20b494e4a req-fa83625a-b0e8-4803-aaea-a9150a909dd4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.373 186962 DEBUG oslo_concurrency.lockutils [req-accc1670-a6c0-476b-9987-e9c20b494e4a req-fa83625a-b0e8-4803-aaea-a9150a909dd4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.374 186962 DEBUG oslo_concurrency.lockutils [req-accc1670-a6c0-476b-9987-e9c20b494e4a req-fa83625a-b0e8-4803-aaea-a9150a909dd4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.374 186962 DEBUG nova.compute.manager [req-accc1670-a6c0-476b-9987-e9c20b494e4a req-fa83625a-b0e8-4803-aaea-a9150a909dd4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Processing event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.374 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.378 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403112.3780296, 0a864665-adb3-44b7-8550-4dcd7f7e8251 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.378 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.380 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.383 186962 INFO nova.virt.libvirt.driver [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Instance spawned successfully.#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.383 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.402 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.409 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.414 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.415 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.416 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.416 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.417 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.418 186962 DEBUG nova.virt.libvirt.driver [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.458 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.505 186962 DEBUG nova.network.neutron [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updated VIF entry in instance network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.506 186962 DEBUG nova.network.neutron [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updating instance_info_cache with network_info: [{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.539 186962 DEBUG oslo_concurrency.lockutils [req-7bc0ccbf-86be-46bf-ad96-5c0d2abc3345 req-5bd82f8a-bd30-4259-bfc1-4fffb12f78ee 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.541 186962 INFO nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Took 4.41 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.541 186962 DEBUG nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.624 186962 INFO nova.compute.manager [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Took 5.58 seconds to build instance.#033[00m
Nov 29 02:58:32 np0005539505 nova_compute[186958]: 2025-11-29 07:58:32.651 186962 DEBUG oslo_concurrency.lockutils [None req-df789e58-6f09-4ab2-bb23-87dd6bfe72f5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:32 np0005539505 podman[255003]: 2025-11-29 07:58:32.662172806 +0000 UTC m=+0.063281341 container create 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:58:32 np0005539505 systemd[1]: Started libpod-conmon-7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2.scope.
Nov 29 02:58:32 np0005539505 podman[255003]: 2025-11-29 07:58:32.62625291 +0000 UTC m=+0.027361465 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:58:32 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:58:32 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15a787e32c5460130364a391f61d4f18063a139068af901cca56967ba85e4499/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:58:32 np0005539505 podman[255003]: 2025-11-29 07:58:32.752191801 +0000 UTC m=+0.153300366 container init 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 29 02:58:32 np0005539505 podman[255003]: 2025-11-29 07:58:32.758918271 +0000 UTC m=+0.160026806 container start 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 29 02:58:32 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [NOTICE]   (255023) : New worker (255025) forked
Nov 29 02:58:32 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [NOTICE]   (255023) : Loading success.
Nov 29 02:58:33 np0005539505 nova_compute[186958]: 2025-11-29 07:58:33.809 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.480 186962 DEBUG nova.compute.manager [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.481 186962 DEBUG oslo_concurrency.lockutils [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.482 186962 DEBUG oslo_concurrency.lockutils [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.482 186962 DEBUG oslo_concurrency.lockutils [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.482 186962 DEBUG nova.compute.manager [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] No waiting events found dispatching network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:58:34 np0005539505 nova_compute[186958]: 2025-11-29 07:58:34.483 186962 WARNING nova.compute.manager [req-fc875b18-b3f6-4de9-9769-300595a20cc7 req-bca6cc82-beeb-46a6-8e9d-13b8a9301aa5 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received unexpected event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:58:34 np0005539505 podman[255035]: 2025-11-29 07:58:34.749882059 +0000 UTC m=+0.073479969 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 29 02:58:34 np0005539505 podman[255034]: 2025-11-29 07:58:34.757662419 +0000 UTC m=+0.084973634 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 29 02:58:35 np0005539505 nova_compute[186958]: 2025-11-29 07:58:35.830 186962 DEBUG nova.compute.manager [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:35 np0005539505 nova_compute[186958]: 2025-11-29 07:58:35.830 186962 DEBUG nova.compute.manager [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing instance network info cache due to event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:58:35 np0005539505 nova_compute[186958]: 2025-11-29 07:58:35.831 186962 DEBUG oslo_concurrency.lockutils [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:58:35 np0005539505 nova_compute[186958]: 2025-11-29 07:58:35.831 186962 DEBUG oslo_concurrency.lockutils [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:58:35 np0005539505 nova_compute[186958]: 2025-11-29 07:58:35.831 186962 DEBUG nova.network.neutron [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:58:36 np0005539505 nova_compute[186958]: 2025-11-29 07:58:36.256 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.166 186962 DEBUG nova.network.neutron [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updated VIF entry in instance network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.167 186962 DEBUG nova.network.neutron [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updating instance_info_cache with network_info: [{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.194 186962 DEBUG oslo_concurrency.lockutils [req-b6694e17-b171-4748-856a-2948a23861bc req-d0806df4-e5c1-4629-b879-455ed21da036 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.508 186962 DEBUG nova.compute.manager [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.509 186962 DEBUG nova.compute.manager [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing instance network info cache due to event network-changed-127f3745-5a5a-40fe-a9dd-c135fbf7d109. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.509 186962 DEBUG oslo_concurrency.lockutils [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.510 186962 DEBUG oslo_concurrency.lockutils [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:58:37 np0005539505 nova_compute[186958]: 2025-11-29 07:58:37.510 186962 DEBUG nova.network.neutron [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Refreshing network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:58:38 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:38.786 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:38 np0005539505 nova_compute[186958]: 2025-11-29 07:58:38.810 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:39 np0005539505 nova_compute[186958]: 2025-11-29 07:58:39.055 186962 DEBUG nova.network.neutron [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updated VIF entry in instance network info cache for port 127f3745-5a5a-40fe-a9dd-c135fbf7d109. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:58:39 np0005539505 nova_compute[186958]: 2025-11-29 07:58:39.056 186962 DEBUG nova.network.neutron [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updating instance_info_cache with network_info: [{"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:58:39 np0005539505 nova_compute[186958]: 2025-11-29 07:58:39.076 186962 DEBUG oslo_concurrency.lockutils [req-3b45890f-60bc-4aa6-ac59-9b45adfb313f req-5bc28112-13ed-4419-907c-a2b859319fb1 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-0a864665-adb3-44b7-8550-4dcd7f7e8251" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:58:41 np0005539505 nova_compute[186958]: 2025-11-29 07:58:41.260 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:43 np0005539505 nova_compute[186958]: 2025-11-29 07:58:43.812 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:44 np0005539505 nova_compute[186958]: 2025-11-29 07:58:44.391 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:45 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:45Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:8f:87 10.100.0.6
Nov 29 02:58:45 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:45Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:8f:87 10.100.0.6
Nov 29 02:58:46 np0005539505 nova_compute[186958]: 2025-11-29 07:58:46.264 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.114 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b6', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.144 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.latency volume: 176633559 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.144 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.latency volume: 25490234 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c7660c2-8fe3-4ce6-81ea-901ef85c4dff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 176633559, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.115189', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c70fc98-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '85f2326c96007060658eab56a1a275633da2877da6425dca02204bb60228bc55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25490234, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.115189', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c710918-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': 'd4188e366f8c55148e8baa3136c172be82f53cf11a79f64ff2ba7705a07e3ae9'}]}, 'timestamp': '2025-11-29 07:58:48.145038', '_unique_id': '8e70cdf8c8fe4d639b0c6111a159f280'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.161 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.161 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1697dfa4-dea9-44c9-81b2-38dc8abd0bc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.147099', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c73985e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': 'bd09a3384167240663094610a9409ec96fa2cffbf7d3caff2a222f7f142d1fc5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.147099', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c73a5f6-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': '194aed8b75f4d961b7b20d30084d6512a23fad9414af7b649beab560302abf53'}]}, 'timestamp': '2025-11-29 07:58:48.162175', '_unique_id': 'd88a900608f44027b27cb1f3f2496d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.163 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.182 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26dac27d-8f8d-4199-9623-8812d74f18fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'timestamp': '2025-11-29T07:58:48.164732', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3c76e626-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.823455307, 'message_signature': 'd4998fa82e59a896c1e29fe795401675b5cae9b00f91608214db3e7dca4c41f8'}]}, 'timestamp': '2025-11-29 07:58:48.183613', '_unique_id': '00a5d1d577ce43678cb0706be7ac4167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.185 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.requests volume: 1108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.186 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c84e57e2-0820-4c1e-87de-970387c29829', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.185955', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c775174-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '7099ca914e02ef6d59f7ded2b630cc77ac6bfe04d93079f6d0328ba6451940e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.185955', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c775b7e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': 'a15600252f1cba21a8a60b4d4e13c9f82f5c6ea4ca2abcfdf2b372a1dcc4a335'}]}, 'timestamp': '2025-11-29 07:58:48.186473', '_unique_id': 'b884572b0f0144d9bffddc1ea5526ab5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.187 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/cpu volume: 10930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6d08b82-b767-4ded-abf9-1072af30dc4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10930000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'timestamp': '2025-11-29T07:58:48.187746', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3c7796d4-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.823455307, 'message_signature': '4d0e87e785efff9a0d47a18e1d50016043ce69695cf23f1235e9ffec9a72e405'}]}, 'timestamp': '2025-11-29 07:58:48.187980', '_unique_id': '965173b3f6184812b9d4956362f61328'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.188 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.189 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.189 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9e5db63-2f73-4470-8ffd-4490fb4bcdb1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.189131', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c77cd5c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': '5934786be3ac8997bcd88645e2dad61764951a28510ea79ad4025e5f8914c2e1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.189131', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c77d676-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': '263060da11e94b68fdb3677957a661f73384867dc6d81616910cdb3a846b93be'}]}, 'timestamp': '2025-11-29 07:58:48.189603', '_unique_id': '4d594b958b5d431fba80dbdc6d9a9ec9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.190 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.194 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0a864665-adb3-44b7-8550-4dcd7f7e8251 / tap127f3745-5a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.194 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f841e25-c75c-487c-871a-e84432be701d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.191158', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c78a222-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '6f1593de959108745ebc4d54a0de985150aae9c763d01e5dfe36f8edeff84c23'}]}, 'timestamp': '2025-11-29 07:58:48.194880', '_unique_id': 'c022c6b263d9413385211dcada6c01ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.196 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.196 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0976a8a9-4dc7-47ea-9d44-61d72080458a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.196434', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c78eb42-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '9807b3e4d7e5bab527dcc90f9e8b725d7088b22ffcdfe115c307283e3f80c7e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 
'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.196434', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c78f402-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '8f5f8aa5c5a16a1ce82add9eb692c5f601ea205aab5fcc165498b50676b0e26f'}]}, 'timestamp': '2025-11-29 07:58:48.196908', '_unique_id': 'a99df941a2ca429b9ff43b516ad28924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.198 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.requests volume: 297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.198 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '950dabfc-f632-45c2-ac06-d297f137133d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 297, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.198088', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c792a8a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': 'c16046d210f1c9fe3908d8918a752b623a70d042221ca9366b407b2e5dd1e8f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.198088', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c79341c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '10731f94d05aa99589f31f52b84fa2ec0c6bfc3f91d0e691069de2ba961bca6b'}]}, 'timestamp': '2025-11-29 07:58:48.198553', '_unique_id': 'feea084880f54284a36646be37613164'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.199 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>]
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.200 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dd8c780-a34f-4df4-b39e-105cbc4bb722', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.200265', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7981a6-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '01fdcd197ebd3550c48613a73889d036a557b4a6e855bfbb0becbfd4d42594ac'}]}, 'timestamp': '2025-11-29 07:58:48.200594', '_unique_id': '812d9941a3314db289f54b25aa2e55e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.202 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9ca4164-42bc-446e-a5ea-eeb0e8380d34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.202122', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c79c9fe-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '50b455f04b37f88321c86b9faec1df1d25cc42a656a2e526d11def8ccb097a6d'}]}, 'timestamp': '2025-11-29 07:58:48.202446', '_unique_id': 'ee95b882218449819a816a65f25dfcb5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.204 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.bytes volume: 30583296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.204 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da0415f7-c839-4a89-86e9-fd39ed94fbb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30583296, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.204033', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c7a13f0-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': 'a28bab99da53e08afb350b473275b282939b498904173860374c5f948c838dd6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.204033', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c7a1d96-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': 'c4b01f8bec48eaecb03a4b24e351eeaedbba146945291e1165c732760c5d5158'}]}, 'timestamp': '2025-11-29 07:58:48.204529', '_unique_id': '623c4fa188184e6ca19b23c079394f35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.205 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.incoming.bytes volume: 1940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc89365a-6e83-4390-8020-b5fb6e396aed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1940, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.205681', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7a53e2-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '27ca6719ee9f554216123a11192ac0597bb81471eab5577e42a5a7dfc9673a7c'}]}, 'timestamp': '2025-11-29 07:58:48.205931', '_unique_id': 'f5a63bcaffb546fe9c5a2104615e0a5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.207 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.207 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>]
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.207 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37df6aee-40d3-4f6b-8a07-e52b45e10fd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.207319', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7a938e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '2b2ffb98df41a03df71071ae3dcb5b6c52a78c4bd8926584b5c2937dffff668a'}]}, 'timestamp': '2025-11-29 07:58:48.207561', '_unique_id': 'b37aa69e71464fbd88a1a2126f2ad9e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.208 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac3c43f7-9234-4ac9-a50e-862082ea9b40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.208693', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7ac9c6-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': 'c3f81d50d1fca6c7f5cff3a91fbe66f2dbc654b1ae67618b60ac77d154438451'}]}, 'timestamp': '2025-11-29 07:58:48.208962', '_unique_id': '2ce00cbeb65a41318821048700ecc6c9'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.209 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.210 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72837057-4433-4cb2-871b-64ef55c66126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.210385', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7b0c4c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': 'd7206306ffdc973123f4526e30d574632d86b339911beceb68efc06dd8a00947'}]}, 'timestamp': '2025-11-29 07:58:48.210698', '_unique_id': 'ea178316f1764fdfb984e04deb8dc0f1'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.211 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.212 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8dbca676-a9ff-4c65-a141-f8c708ef1dec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.212197', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7b53a0-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '5735d673b875bfd10c75bca29550e556f0449ea34ea6c4d9d1fc95c1d57f1561'}]}, 'timestamp': '2025-11-29 07:58:48.212520', '_unique_id': 'c4f80c3957ed49a3975be50c8d2da79f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.213 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ed673e5-67cb-4ad7-bb76-f25d284cea70', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.213971', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7b96e4-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '8b8ce8f9d988263510313dc479661353199510cf016660796907ff29f90a19b7'}]}, 'timestamp': '2025-11-29 07:58:48.214202', '_unique_id': '1f9d6aa30b3a4908ad0e07259d99685c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.214 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>]
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.latency volume: 7110937597 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.215 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd83157e-dc50-44ab-951e-d3dda1e7941f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7110937597, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.215690', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c7bdb4a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '22a0f212c1fb7ac6ba5c48f1bbdeaba12252b6daa696be039097c8e29c0525d0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.215690', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c7be34c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.75604184, 'message_signature': '74a0848ef085ca2c6417b0b9c0917ce93e96b1b53a9447033bf75f7f53eed7b0'}]}, 'timestamp': '2025-11-29 07:58:48.216141', '_unique_id': '87fd65a021974fca81e06d2561d62583'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.216 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403>]
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.217 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff24d458-3f40-4841-b9b5-e1678832a9bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-vda', 'timestamp': '2025-11-29T07:58:48.217571', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c7c2370-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': 'a4254ef451155d7674f047649bfd6f8656f513fc1b61ddbf291d5313ac861639'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251-sda', 'timestamp': '2025-11-29T07:58:48.217571', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'instance-000000b6', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c7c2b5e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.787950603, 'message_signature': '211434a8c35b1cdf6712c8cf17fa9843888f5716addbfd71474dc9934a7f80c0'}]}, 'timestamp': '2025-11-29 07:58:48.217986', '_unique_id': '8c9d6e27ba7442e9955723364d7ffe99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 DEBUG ceilometer.compute.pollsters [-] 0a864665-adb3-44b7-8550-4dcd7f7e8251/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a538040f-6d16-45cd-bba5-f1eebce1baeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b6-0a864665-adb3-44b7-8550-4dcd7f7e8251-tap127f3745-5a', 'timestamp': '2025-11-29T07:58:48.219054', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403', 'name': 'tap127f3745-5a', 'instance_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:8f:87', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap127f3745-5a'}, 'message_id': '3c7c5d5e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8575.8320486, 'message_signature': '5890109cb5bd953bec3366e09e3c7b211041b7d26be53bb02edfde479458d49a'}]}, 'timestamp': '2025-11-29 07:58:48.219311', '_unique_id': 'b6c500384b9b4bf2aca81f89be0a8a03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 02:58:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 07:58:48.219 12 ERROR oslo_messaging.notify.messaging 
Nov 29 02:58:48 np0005539505 nova_compute[186958]: 2025-11-29 07:58:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:48 np0005539505 nova_compute[186958]: 2025-11-29 07:58:48.380 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:48 np0005539505 nova_compute[186958]: 2025-11-29 07:58:48.380 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:58:48 np0005539505 podman[255091]: 2025-11-29 07:58:48.722242968 +0000 UTC m=+0.052919747 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:58:48 np0005539505 podman[255090]: 2025-11-29 07:58:48.756324932 +0000 UTC m=+0.090271864 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 29 02:58:48 np0005539505 nova_compute[186958]: 2025-11-29 07:58:48.814 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.360 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.361 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.361 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.361 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.361 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.378 186962 INFO nova.compute.manager [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Terminating instance#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.390 186962 DEBUG nova.compute.manager [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 02:58:50 np0005539505 kernel: tap127f3745-5a (unregistering): left promiscuous mode
Nov 29 02:58:50 np0005539505 NetworkManager[55134]: <info>  [1764403130.4123] device (tap127f3745-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.421 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:50Z|00849|binding|INFO|Releasing lport 127f3745-5a5a-40fe-a9dd-c135fbf7d109 from this chassis (sb_readonly=0)
Nov 29 02:58:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:50Z|00850|binding|INFO|Setting lport 127f3745-5a5a-40fe-a9dd-c135fbf7d109 down in Southbound
Nov 29 02:58:50 np0005539505 ovn_controller[95143]: 2025-11-29T07:58:50Z|00851|binding|INFO|Removing iface tap127f3745-5a ovn-installed in OVS
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.422 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.428 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:8f:87 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0a864665-adb3-44b7-8550-4dcd7f7e8251', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3099386-1dbe-4af7-95d0-de6761c24471, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=127f3745-5a5a-40fe-a9dd-c135fbf7d109) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.429 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 127f3745-5a5a-40fe-a9dd-c135fbf7d109 in datapath e25d5113-a42d-44ca-8e65-a777d9e11f48 unbound from our chassis#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.431 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e25d5113-a42d-44ca-8e65-a777d9e11f48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.431 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86e20594-6f90-41a1-bbdc-cb5a264b0df6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.432 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 namespace which is not needed anymore#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.437 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Nov 29 02:58:50 np0005539505 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b6.scope: Consumed 12.257s CPU time.
Nov 29 02:58:50 np0005539505 systemd-machined[153285]: Machine qemu-88-instance-000000b6 terminated.
Nov 29 02:58:50 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [NOTICE]   (255023) : haproxy version is 2.8.14-c23fe91
Nov 29 02:58:50 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [NOTICE]   (255023) : path to executable is /usr/sbin/haproxy
Nov 29 02:58:50 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [WARNING]  (255023) : Exiting Master process...
Nov 29 02:58:50 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [ALERT]    (255023) : Current worker (255025) exited with code 143 (Terminated)
Nov 29 02:58:50 np0005539505 neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48[255018]: [WARNING]  (255023) : All workers exited. Exiting... (0)
Nov 29 02:58:50 np0005539505 systemd[1]: libpod-7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2.scope: Deactivated successfully.
Nov 29 02:58:50 np0005539505 podman[255158]: 2025-11-29 07:58:50.563598936 +0000 UTC m=+0.045055935 container died 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:58:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2-userdata-shm.mount: Deactivated successfully.
Nov 29 02:58:50 np0005539505 systemd[1]: var-lib-containers-storage-overlay-15a787e32c5460130364a391f61d4f18063a139068af901cca56967ba85e4499-merged.mount: Deactivated successfully.
Nov 29 02:58:50 np0005539505 podman[255158]: 2025-11-29 07:58:50.612105388 +0000 UTC m=+0.093562377 container cleanup 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:58:50 np0005539505 systemd[1]: libpod-conmon-7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2.scope: Deactivated successfully.
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.666 186962 INFO nova.virt.libvirt.driver [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Instance destroyed successfully.#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.667 186962 DEBUG nova.objects.instance [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid 0a864665-adb3-44b7-8550-4dcd7f7e8251 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:58:50 np0005539505 podman[255193]: 2025-11-29 07:58:50.678984609 +0000 UTC m=+0.042073130 container remove 7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.680 186962 DEBUG nova.virt.libvirt.vif [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:58:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-gen-1-1604603403',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ge',id=182,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCbO5lg6MOx4cmqxiU7BbZWHBZr0hw1OFyx1wi8ylE3DD5XkolVPeqN7tTosQEkEUch/54VxCcm6kKSgh3IzPEqFnEAKzoXbjFEzrMqt3IWtOqFGx5kqMbL3gOxuWmKn/A==',key_name='tempest-TestSecurityGroupsBasicOps-519869731',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:58:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-aeiperp7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:58:32Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=0a864665-adb3-44b7-8550-4dcd7f7e8251,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.681 186962 DEBUG nova.network.os_vif_util [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "address": "fa:16:3e:65:8f:87", "network": {"id": "e25d5113-a42d-44ca-8e65-a777d9e11f48", "bridge": "br-int", "label": "tempest-network-smoke--1591284109", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap127f3745-5a", "ovs_interfaceid": "127f3745-5a5a-40fe-a9dd-c135fbf7d109", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.681 186962 DEBUG nova.network.os_vif_util [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.681 186962 DEBUG os_vif [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.683 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.683 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap127f3745-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.686 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.686 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[027d1950-8866-4d6a-b358-d1db22dccc28]: (4, ('Sat Nov 29 07:58:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 (7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2)\n7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2\nSat Nov 29 07:58:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 (7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2)\n7c6baa1e59d254cd44cc26c27a937e1e7222319e594ed879be7dded1f92839d2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.689 186962 INFO os_vif [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:8f:87,bridge_name='br-int',has_traffic_filtering=True,id=127f3745-5a5a-40fe-a9dd-c135fbf7d109,network=Network(e25d5113-a42d-44ca-8e65-a777d9e11f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap127f3745-5a')#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.689 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[96e13093-97ad-4092-a108-244fc5047bbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.690 186962 INFO nova.virt.libvirt.driver [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Deleting instance files /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251_del#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.690 186962 INFO nova.virt.libvirt.driver [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Deletion of /var/lib/nova/instances/0a864665-adb3-44b7-8550-4dcd7f7e8251_del complete#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.691 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape25d5113-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.692 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 kernel: tape25d5113-a0: left promiscuous mode
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.698 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b71a32-a779-4fe2-94e9-68092963e7ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.722 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[619d5d1d-9ed8-49ee-970b-7587e70ce25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.724 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8e8dde84-c0a2-48a9-857e-86cc9f64c2d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.739 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[729488f3-3c38-4a3d-afbf-e32888289e90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 855956, 'reachable_time': 24692, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255218, 'error': None, 'target': 'ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.742 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e25d5113-a42d-44ca-8e65-a777d9e11f48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 02:58:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:58:50.742 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[72c1a477-271b-45ae-a016-c522655fbcf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:58:50 np0005539505 systemd[1]: run-netns-ovnmeta\x2de25d5113\x2da42d\x2d44ca\x2d8e65\x2da777d9e11f48.mount: Deactivated successfully.
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.752 186962 INFO nova.compute.manager [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.753 186962 DEBUG oslo.service.loopingcall [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.753 186962 DEBUG nova.compute.manager [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 02:58:50 np0005539505 nova_compute[186958]: 2025-11-29 07:58:50.754 186962 DEBUG nova.network.neutron [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.318 186962 DEBUG nova.compute.manager [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-unplugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.318 186962 DEBUG oslo_concurrency.lockutils [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.319 186962 DEBUG oslo_concurrency.lockutils [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.319 186962 DEBUG oslo_concurrency.lockutils [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.319 186962 DEBUG nova.compute.manager [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] No waiting events found dispatching network-vif-unplugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.319 186962 DEBUG nova.compute.manager [req-0948d1c1-5da3-4564-9bcb-05d55c121cdd req-12f461c4-7a8c-468a-8bd1-8abdce11d5c6 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-unplugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.406 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.407 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.407 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.407 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.563 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.564 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5682MB free_disk=73.07085800170898GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.755 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 0a864665-adb3-44b7-8550-4dcd7f7e8251 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.755 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.755 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.801 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:58:51 np0005539505 nova_compute[186958]: 2025-11-29 07:58:51.815 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.242 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.243 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.391 186962 DEBUG nova.network.neutron [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.408 186962 INFO nova.compute.manager [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Took 1.65 seconds to deallocate network for instance.#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.462 186962 DEBUG nova.compute.manager [req-321efd1b-2ecd-417a-96c8-1d503604ab8a req-dca5f4d4-347a-4e34-9e09-2b2f879fd806 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-deleted-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.489 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.490 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.548 186962 DEBUG nova.compute.provider_tree [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.565 186962 DEBUG nova.scheduler.client.report [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.587 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.609 186962 INFO nova.scheduler.client.report [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance 0a864665-adb3-44b7-8550-4dcd7f7e8251#033[00m
Nov 29 02:58:52 np0005539505 nova_compute[186958]: 2025-11-29 07:58:52.705 186962 DEBUG oslo_concurrency.lockutils [None req-4db531c4-2e0a-49e5-be45-a40f6259c2d5 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:52 np0005539505 podman[255220]: 2025-11-29 07:58:52.722802451 +0000 UTC m=+0.052909417 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.243 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.449 186962 DEBUG nova.compute.manager [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.449 186962 DEBUG oslo_concurrency.lockutils [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.449 186962 DEBUG oslo_concurrency.lockutils [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.449 186962 DEBUG oslo_concurrency.lockutils [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "0a864665-adb3-44b7-8550-4dcd7f7e8251-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.450 186962 DEBUG nova.compute.manager [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] No waiting events found dispatching network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.450 186962 WARNING nova.compute.manager [req-73edcdd9-3d70-4e5d-9c90-239631996404 req-eab512fe-f0b4-4b62-8ba4-f2be52434761 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Received unexpected event network-vif-plugged-127f3745-5a5a-40fe-a9dd-c135fbf7d109 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.451 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.451 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:53 np0005539505 nova_compute[186958]: 2025-11-29 07:58:53.816 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:55 np0005539505 nova_compute[186958]: 2025-11-29 07:58:55.686 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:58:57 np0005539505 nova_compute[186958]: 2025-11-29 07:58:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:58:58 np0005539505 nova_compute[186958]: 2025-11-29 07:58:58.841 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:00 np0005539505 nova_compute[186958]: 2025-11-29 07:59:00.687 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:00 np0005539505 podman[255240]: 2025-11-29 07:59:00.72201966 +0000 UTC m=+0.051889178 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:59:00 np0005539505 podman[255241]: 2025-11-29 07:59:00.762232667 +0000 UTC m=+0.086944499 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 02:59:01 np0005539505 nova_compute[186958]: 2025-11-29 07:59:01.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:03 np0005539505 nova_compute[186958]: 2025-11-29 07:59:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:03 np0005539505 nova_compute[186958]: 2025-11-29 07:59:03.842 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:05 np0005539505 nova_compute[186958]: 2025-11-29 07:59:05.664 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403130.662708, 0a864665-adb3-44b7-8550-4dcd7f7e8251 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:59:05 np0005539505 nova_compute[186958]: 2025-11-29 07:59:05.664 186962 INFO nova.compute.manager [-] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] VM Stopped (Lifecycle Event)#033[00m
Nov 29 02:59:05 np0005539505 nova_compute[186958]: 2025-11-29 07:59:05.689 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:05 np0005539505 podman[255291]: 2025-11-29 07:59:05.717153085 +0000 UTC m=+0.052681301 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 02:59:05 np0005539505 podman[255292]: 2025-11-29 07:59:05.721096276 +0000 UTC m=+0.053597986 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:59:06 np0005539505 nova_compute[186958]: 2025-11-29 07:59:06.152 186962 DEBUG nova.compute.manager [None req-f8e71eee-1719-4132-9255-bf362b413820 - - - - - -] [instance: 0a864665-adb3-44b7-8550-4dcd7f7e8251] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:59:08 np0005539505 nova_compute[186958]: 2025-11-29 07:59:08.844 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:10 np0005539505 nova_compute[186958]: 2025-11-29 07:59:10.692 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:13 np0005539505 nova_compute[186958]: 2025-11-29 07:59:13.846 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:15 np0005539505 nova_compute[186958]: 2025-11-29 07:59:15.695 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:18 np0005539505 nova_compute[186958]: 2025-11-29 07:59:18.481 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:18 np0005539505 nova_compute[186958]: 2025-11-29 07:59:18.621 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:18 np0005539505 nova_compute[186958]: 2025-11-29 07:59:18.848 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:19 np0005539505 podman[255331]: 2025-11-29 07:59:19.737077402 +0000 UTC m=+0.066792820 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 02:59:19 np0005539505 podman[255330]: 2025-11-29 07:59:19.747151477 +0000 UTC m=+0.078747778 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 29 02:59:20 np0005539505 nova_compute[186958]: 2025-11-29 07:59:20.696 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:23 np0005539505 podman[255373]: 2025-11-29 07:59:23.706841424 +0000 UTC m=+0.044491999 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:59:23 np0005539505 nova_compute[186958]: 2025-11-29 07:59:23.850 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:25 np0005539505 nova_compute[186958]: 2025-11-29 07:59:25.699 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:27.548 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:27.549 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:27.549 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:28 np0005539505 nova_compute[186958]: 2025-11-29 07:59:28.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:30 np0005539505 nova_compute[186958]: 2025-11-29 07:59:30.700 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:31 np0005539505 podman[255394]: 2025-11-29 07:59:31.713238727 +0000 UTC m=+0.049119460 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 02:59:31 np0005539505 podman[255395]: 2025-11-29 07:59:31.744278485 +0000 UTC m=+0.077948326 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 02:59:33 np0005539505 nova_compute[186958]: 2025-11-29 07:59:33.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:35 np0005539505 nova_compute[186958]: 2025-11-29 07:59:35.703 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:36 np0005539505 podman[255446]: 2025-11-29 07:59:36.710131542 +0000 UTC m=+0.047348670 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:59:36 np0005539505 podman[255447]: 2025-11-29 07:59:36.738682519 +0000 UTC m=+0.067113928 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 02:59:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:37.851 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:59:37 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:37.852 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 02:59:37 np0005539505 nova_compute[186958]: 2025-11-29 07:59:37.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:38 np0005539505 nova_compute[186958]: 2025-11-29 07:59:38.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:39 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:39.854 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:40 np0005539505 nova_compute[186958]: 2025-11-29 07:59:40.705 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.634 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.634 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.659 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.787 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.788 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.795 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.795 186962 INFO nova.compute.claims [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.857 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.944 186962 DEBUG nova.compute.provider_tree [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.967 186962 DEBUG nova.scheduler.client.report [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.997 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:43 np0005539505 nova_compute[186958]: 2025-11-29 07:59:43.998 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.072 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.073 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.095 186962 INFO nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.113 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.478 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.480 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.480 186962 INFO nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Creating image(s)#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.481 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.481 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.481 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.497 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.557 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.559 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.560 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.572 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.640 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.641 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.688 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.689 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.690 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.755 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.757 186962 DEBUG nova.virt.disk.api [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Checking if we can resize image /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.757 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.826 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.827 186962 DEBUG nova.virt.disk.api [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Cannot resize image /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.828 186962 DEBUG nova.objects.instance [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a55b16d-d538-42b5-8f50-87c39d63b4eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.847 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.848 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Ensure instance console log exists: /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.848 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.849 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:44 np0005539505 nova_compute[186958]: 2025-11-29 07:59:44.849 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:45 np0005539505 nova_compute[186958]: 2025-11-29 07:59:45.216 186962 DEBUG nova.policy [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 02:59:45 np0005539505 nova_compute[186958]: 2025-11-29 07:59:45.707 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:48 np0005539505 nova_compute[186958]: 2025-11-29 07:59:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:48 np0005539505 nova_compute[186958]: 2025-11-29 07:59:48.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 02:59:48 np0005539505 nova_compute[186958]: 2025-11-29 07:59:48.469 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Successfully created port: 4907b21d-7156-4b51-b9bb-e881a3f55cb8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 02:59:48 np0005539505 nova_compute[186958]: 2025-11-29 07:59:48.858 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.124 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Successfully updated port: 4907b21d-7156-4b51-b9bb-e881a3f55cb8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.138 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.138 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.138 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.229 186962 DEBUG nova.compute.manager [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.229 186962 DEBUG nova.compute.manager [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing instance network info cache due to event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.229 186962 DEBUG oslo_concurrency.lockutils [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.276 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 02:59:49 np0005539505 nova_compute[186958]: 2025-11-29 07:59:49.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:50 np0005539505 nova_compute[186958]: 2025-11-29 07:59:50.709 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:50 np0005539505 podman[255503]: 2025-11-29 07:59:50.718689577 +0000 UTC m=+0.045930230 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 02:59:50 np0005539505 podman[255502]: 2025-11-29 07:59:50.71915505 +0000 UTC m=+0.050032956 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.188 186962 DEBUG nova.network.neutron [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.870 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.871 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Instance network_info: |[{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.872 186962 DEBUG oslo_concurrency.lockutils [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.872 186962 DEBUG nova.network.neutron [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.875 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Start _get_guest_xml network_info=[{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.881 186962 WARNING nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.889 186962 DEBUG nova.virt.libvirt.host [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.890 186962 DEBUG nova.virt.libvirt.host [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.895 186962 DEBUG nova.virt.libvirt.host [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.896 186962 DEBUG nova.virt.libvirt.host [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.897 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.898 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.898 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.899 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.899 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.899 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.899 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.900 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.900 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.900 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.900 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.901 186962 DEBUG nova.virt.hardware [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.904 186962 DEBUG nova.virt.libvirt.vif [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:59:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=183,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-iy9xu73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:59:44Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=4a55b16d-d538-42b5-8f50-87c39d63b4eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.905 186962 DEBUG nova.network.os_vif_util [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.906 186962 DEBUG nova.network.os_vif_util [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.906 186962 DEBUG nova.objects.instance [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a55b16d-d538-42b5-8f50-87c39d63b4eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.957 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] End _get_guest_xml xml=<domain type="kvm">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <uuid>4a55b16d-d538-42b5-8f50-87c39d63b4eb</uuid>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <name>instance-000000b7</name>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208</nova:name>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 07:59:51</nova:creationTime>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:user uuid="dec30fbde18e4b2382ea2c59847d067f">tempest-TestSecurityGroupsBasicOps-2022058758-project-member</nova:user>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:project uuid="e8e45e91223b45a79dd698a82af4a2a5">tempest-TestSecurityGroupsBasicOps-2022058758</nova:project>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        <nova:port uuid="4907b21d-7156-4b51-b9bb-e881a3f55cb8">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <system>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="serial">4a55b16d-d538-42b5-8f50-87c39d63b4eb</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="uuid">4a55b16d-d538-42b5-8f50-87c39d63b4eb</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </system>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <os>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </os>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <features>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </features>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </clock>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  <devices>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.config"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </disk>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:8b:96:04"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <target dev="tap4907b21d-71"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </interface>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/console.log" append="off"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </serial>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <video>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </video>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </rng>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 02:59:51 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 02:59:51 np0005539505 nova_compute[186958]:  </devices>
Nov 29 02:59:51 np0005539505 nova_compute[186958]: </domain>
Nov 29 02:59:51 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.958 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Preparing to wait for external event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.959 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.959 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.959 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.960 186962 DEBUG nova.virt.libvirt.vif [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T07:59:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=183,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-iy9xu73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T07:59:44Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=4a55b16d-d538-42b5-8f50-87c39d63b4eb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.960 186962 DEBUG nova.network.os_vif_util [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.961 186962 DEBUG nova.network.os_vif_util [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.961 186962 DEBUG os_vif [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.962 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.962 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.966 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4907b21d-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.966 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4907b21d-71, col_values=(('external_ids', {'iface-id': '4907b21d-7156-4b51-b9bb-e881a3f55cb8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:96:04', 'vm-uuid': '4a55b16d-d538-42b5-8f50-87c39d63b4eb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.967 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:51 np0005539505 NetworkManager[55134]: <info>  [1764403191.9685] manager: (tap4907b21d-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.970 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:51 np0005539505 nova_compute[186958]: 2025-11-29 07:59:51.974 186962 INFO os_vif [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71')#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.118 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.119 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.119 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] No VIF found with MAC fa:16:3e:8b:96:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.119 186962 INFO nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Using config drive#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.401 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.402 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.402 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.452 186962 INFO nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Creating config drive at /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.config#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.458 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wqg9amv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.482 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.544 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.545 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.581 186962 DEBUG oslo_concurrency.processutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wqg9amv" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.602 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 02:59:52 np0005539505 kernel: tap4907b21d-71: entered promiscuous mode
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.6557] manager: (tap4907b21d-71): new Tun device (/org/freedesktop/NetworkManager/Devices/426)
Nov 29 02:59:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:52Z|00852|binding|INFO|Claiming lport 4907b21d-7156-4b51-b9bb-e881a3f55cb8 for this chassis.
Nov 29 02:59:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:52Z|00853|binding|INFO|4907b21d-7156-4b51-b9bb-e881a3f55cb8: Claiming fa:16:3e:8b:96:04 10.100.0.10
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.657 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.661 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.665 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.674 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:96:04 10.100.0.10'], port_security=['fa:16:3e:8b:96:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6649faa1-db80-42dc-8e4b-bb2b1a7f56ec f2239602-f250-44be-ba22-c84ac97adb08', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d94b7d30-29ce-4bb4-a204-dc237d12f274, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4907b21d-7156-4b51-b9bb-e881a3f55cb8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.676 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4907b21d-7156-4b51-b9bb-e881a3f55cb8 in datapath ff3fc050-ba7a-4fdb-b763-76384fb9149e bound to our chassis#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.677 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff3fc050-ba7a-4fdb-b763-76384fb9149e#033[00m
Nov 29 02:59:52 np0005539505 systemd-udevd[255573]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.690 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4f608631-a6fd-4b35-a495-79e8a8cd5d38]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.690 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff3fc050-b1 in ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 02:59:52 np0005539505 systemd-machined[153285]: New machine qemu-89-instance-000000b7.
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.696 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff3fc050-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.696 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[9e59eb1a-c34e-488f-a6d3-20247e7303b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.698 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[e32b1749-997f-463c-8db3-0a82a88f3f35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.7068] device (tap4907b21d-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.7079] device (tap4907b21d-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.711 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[4546aa63-a74f-401a-8068-4dae580abe04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 systemd[1]: Started Virtual Machine qemu-89-instance-000000b7.
Nov 29 02:59:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:52Z|00854|binding|INFO|Setting lport 4907b21d-7156-4b51-b9bb-e881a3f55cb8 ovn-installed in OVS
Nov 29 02:59:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:52Z|00855|binding|INFO|Setting lport 4907b21d-7156-4b51-b9bb-e881a3f55cb8 up in Southbound
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.721 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.725 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.727 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2e80bc79-6285-4fac-b1dc-8c95c7250722]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.763 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[93361cc9-0d3f-443d-a665-56edd569dd38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.768 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fad749f0-2016-46aa-8d6c-7a20f48a63ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.7695] manager: (tapff3fc050-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/427)
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.796 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[a24e83db-4c44-44f4-befa-b2ce44337ad3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.800 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[f12d279c-ff4e-47c3-b32b-f97d758f3b44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.8234] device (tapff3fc050-b0): carrier: link connected
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.828 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[1b31a370-ffcc-4fb5-804e-108c8637853b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.846 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[247a3351-c74b-45a2-bd70-848ab9d2c240]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff3fc050-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:47:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864040, 'reachable_time': 41104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255605, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.859 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.861 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5719MB free_disk=73.0706558227539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.861 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.861 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.863 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccfa6f5-67c2-4efd-b6b0-bf12688a6594]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:47d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 864040, 'tstamp': 864040}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255606, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.879 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[32e1854a-de29-4a50-bb64-05e2d5406c5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff3fc050-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:47:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 265], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864040, 'reachable_time': 41104, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255607, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.909 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9ffe65-a7da-4cd2-96fc-99609ef16ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.970 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e788cd-8c38-482a-a54f-be6d2b8c49c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.971 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff3fc050-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.971 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.972 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff3fc050-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:52 np0005539505 NetworkManager[55134]: <info>  [1764403192.9745] manager: (tapff3fc050-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Nov 29 02:59:52 np0005539505 kernel: tapff3fc050-b0: entered promiscuous mode
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.977 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff3fc050-b0, col_values=(('external_ids', {'iface-id': '965f3bec-4819-4a7d-a97c-c5af8f6aa242'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 02:59:52 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:52Z|00856|binding|INFO|Releasing lport 965f3bec-4819-4a7d-a97c-c5af8f6aa242 from this chassis (sb_readonly=0)
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.977 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.991 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 02:59:52 np0005539505 nova_compute[186958]: 2025-11-29 07:59:52.992 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.993 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[aab866de-1a88-40b0-8590-6f5a71fdac6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.994 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-ff3fc050-ba7a-4fdb-b763-76384fb9149e
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/ff3fc050-ba7a-4fdb-b763-76384fb9149e.pid.haproxy
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID ff3fc050-ba7a-4fdb-b763-76384fb9149e
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 02:59:52 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 07:59:52.994 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'env', 'PROCESS_TAG=haproxy-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff3fc050-ba7a-4fdb-b763-76384fb9149e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.006 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 4a55b16d-d538-42b5-8f50-87c39d63b4eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.006 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.006 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.098 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.112 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.279 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.279 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.371 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403193.3708997, 4a55b16d-d538-42b5-8f50-87c39d63b4eb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.371 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] VM Started (Lifecycle Event)#033[00m
Nov 29 02:59:53 np0005539505 podman[255645]: 2025-11-29 07:59:53.374605706 +0000 UTC m=+0.046724592 container create e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.400 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.404 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403193.3713825, 4a55b16d-d538-42b5-8f50-87c39d63b4eb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.405 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] VM Paused (Lifecycle Event)#033[00m
Nov 29 02:59:53 np0005539505 systemd[1]: Started libpod-conmon-e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f.scope.
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.413 186962 DEBUG nova.compute.manager [req-c24a798f-5626-4292-8170-a97216dac9c7 req-105ea4fc-7e6a-4302-8659-5b62845ae5e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.413 186962 DEBUG oslo_concurrency.lockutils [req-c24a798f-5626-4292-8170-a97216dac9c7 req-105ea4fc-7e6a-4302-8659-5b62845ae5e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.414 186962 DEBUG oslo_concurrency.lockutils [req-c24a798f-5626-4292-8170-a97216dac9c7 req-105ea4fc-7e6a-4302-8659-5b62845ae5e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.414 186962 DEBUG oslo_concurrency.lockutils [req-c24a798f-5626-4292-8170-a97216dac9c7 req-105ea4fc-7e6a-4302-8659-5b62845ae5e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.414 186962 DEBUG nova.compute.manager [req-c24a798f-5626-4292-8170-a97216dac9c7 req-105ea4fc-7e6a-4302-8659-5b62845ae5e4 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Processing event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.415 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.419 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.421 186962 INFO nova.virt.libvirt.driver [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Instance spawned successfully.#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.422 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.426 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.431 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403193.4181337, 4a55b16d-d538-42b5-8f50-87c39d63b4eb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.432 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] VM Resumed (Lifecycle Event)#033[00m
Nov 29 02:59:53 np0005539505 systemd[1]: Started libcrun container.
Nov 29 02:59:53 np0005539505 podman[255645]: 2025-11-29 07:59:53.349839776 +0000 UTC m=+0.021958692 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 02:59:53 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da951229cc36c4f754f1622a613efdd183d751e981d92e0677f348ca61732f0a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.454 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.455 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.455 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.456 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.456 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.457 186962 DEBUG nova.virt.libvirt.driver [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 29 02:59:53 np0005539505 podman[255645]: 2025-11-29 07:59:53.461424901 +0000 UTC m=+0.133543817 container init e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 02:59:53 np0005539505 podman[255645]: 2025-11-29 07:59:53.466184736 +0000 UTC m=+0.138303622 container start e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.466 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.469 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 29 02:59:53 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [NOTICE]   (255666) : New worker (255668) forked
Nov 29 02:59:53 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [NOTICE]   (255666) : Loading success.
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.500 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.547 186962 INFO nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Took 9.07 seconds to spawn the instance on the hypervisor.#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.548 186962 DEBUG nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.632 186962 INFO nova.compute.manager [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Took 9.91 seconds to build instance.#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.651 186962 DEBUG oslo_concurrency.lockutils [None req-a7557bf0-0dfd-49ee-9ecd-7018bfcfbfe2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:53 np0005539505 nova_compute[186958]: 2025-11-29 07:59:53.864 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:54 np0005539505 nova_compute[186958]: 2025-11-29 07:59:54.237 186962 DEBUG nova.network.neutron [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updated VIF entry in instance network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:59:54 np0005539505 nova_compute[186958]: 2025-11-29 07:59:54.239 186962 DEBUG nova.network.neutron [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:59:54 np0005539505 nova_compute[186958]: 2025-11-29 07:59:54.258 186962 DEBUG oslo_concurrency.lockutils [req-912b9b55-c958-4541-bebd-6710eb4b4caa req-3698eb8a-4462-4b29-89dc-f6dd5bfc4dcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:59:54 np0005539505 podman[255677]: 2025-11-29 07:59:54.716484831 +0000 UTC m=+0.048753350 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.281 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.282 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.282 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.478 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.479 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.479 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 02:59:55 np0005539505 nova_compute[186958]: 2025-11-29 07:59:55.480 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4a55b16d-d538-42b5-8f50-87c39d63b4eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.378 186962 DEBUG nova.compute.manager [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.379 186962 DEBUG oslo_concurrency.lockutils [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.379 186962 DEBUG oslo_concurrency.lockutils [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.380 186962 DEBUG oslo_concurrency.lockutils [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.380 186962 DEBUG nova.compute.manager [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] No waiting events found dispatching network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.380 186962 WARNING nova.compute.manager [req-34eb5d25-c428-44aa-80fa-aad801954b00 req-cb63dd6c-3b91-488d-b785-d877caaa9942 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received unexpected event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 for instance with vm_state active and task_state None.#033[00m
Nov 29 02:59:56 np0005539505 nova_compute[186958]: 2025-11-29 07:59:56.969 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.668 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.699 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.699 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.700 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.700 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.700 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 02:59:57 np0005539505 NetworkManager[55134]: <info>  [1764403197.9536] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/429)
Nov 29 02:59:57 np0005539505 NetworkManager[55134]: <info>  [1764403197.9548] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Nov 29 02:59:57 np0005539505 nova_compute[186958]: 2025-11-29 07:59:57.954 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.073 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:58 np0005539505 ovn_controller[95143]: 2025-11-29T07:59:58Z|00857|binding|INFO|Releasing lport 965f3bec-4819-4a7d-a97c-c5af8f6aa242 from this chassis (sb_readonly=0)
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.088 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.426 186962 DEBUG nova.compute.manager [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.427 186962 DEBUG nova.compute.manager [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing instance network info cache due to event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.428 186962 DEBUG oslo_concurrency.lockutils [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.428 186962 DEBUG oslo_concurrency.lockutils [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.428 186962 DEBUG nova.network.neutron [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 02:59:58 np0005539505 nova_compute[186958]: 2025-11-29 07:59:58.866 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 02:59:59 np0005539505 nova_compute[186958]: 2025-11-29 07:59:59.330 186962 DEBUG nova.network.neutron [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updated VIF entry in instance network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 02:59:59 np0005539505 nova_compute[186958]: 2025-11-29 07:59:59.331 186962 DEBUG nova.network.neutron [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 02:59:59 np0005539505 nova_compute[186958]: 2025-11-29 07:59:59.357 186962 DEBUG oslo_concurrency.lockutils [req-fe12cf89-90a3-4231-8941-c2d76c4b5e35 req-909a5492-cb62-44ad-a03c-2c512f336124 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 03:00:01 np0005539505 nova_compute[186958]: 2025-11-29 08:00:01.972 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:02 np0005539505 podman[255700]: 2025-11-29 08:00:02.73619281 +0000 UTC m=+0.058384492 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:00:02 np0005539505 podman[255701]: 2025-11-29 08:00:02.810846711 +0000 UTC m=+0.127837075 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:00:03 np0005539505 nova_compute[186958]: 2025-11-29 08:00:03.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:04 np0005539505 nova_compute[186958]: 2025-11-29 08:00:04.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:04 np0005539505 nova_compute[186958]: 2025-11-29 08:00:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:06 np0005539505 ovn_controller[95143]: 2025-11-29T08:00:06Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:96:04 10.100.0.10
Nov 29 03:00:06 np0005539505 ovn_controller[95143]: 2025-11-29T08:00:06Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:96:04 10.100.0.10
Nov 29 03:00:06 np0005539505 nova_compute[186958]: 2025-11-29 08:00:06.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:07 np0005539505 podman[255762]: 2025-11-29 08:00:07.728694891 +0000 UTC m=+0.052316500 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 29 03:00:07 np0005539505 podman[255763]: 2025-11-29 08:00:07.760193672 +0000 UTC m=+0.082373230 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 29 03:00:08 np0005539505 nova_compute[186958]: 2025-11-29 08:00:08.869 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:11 np0005539505 nova_compute[186958]: 2025-11-29 08:00:11.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:13 np0005539505 nova_compute[186958]: 2025-11-29 08:00:13.871 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:16 np0005539505 nova_compute[186958]: 2025-11-29 08:00:16.984 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:18 np0005539505 nova_compute[186958]: 2025-11-29 08:00:18.873 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:21 np0005539505 podman[255803]: 2025-11-29 08:00:21.732119522 +0000 UTC m=+0.057278811 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:00:21 np0005539505 podman[255802]: 2025-11-29 08:00:21.737054671 +0000 UTC m=+0.065909504 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vendor=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, config_id=edpm)
Nov 29 03:00:21 np0005539505 nova_compute[186958]: 2025-11-29 08:00:21.987 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:23 np0005539505 nova_compute[186958]: 2025-11-29 08:00:23.875 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:25 np0005539505 podman[255843]: 2025-11-29 08:00:25.714512391 +0000 UTC m=+0.045402775 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 29 03:00:26 np0005539505 nova_compute[186958]: 2025-11-29 08:00:26.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:27.549 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:27.550 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:27.551 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:28 np0005539505 nova_compute[186958]: 2025-11-29 08:00:28.877 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:31 np0005539505 nova_compute[186958]: 2025-11-29 08:00:31.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:33 np0005539505 podman[255863]: 2025-11-29 08:00:33.725261847 +0000 UTC m=+0.057707153 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 03:00:33 np0005539505 podman[255864]: 2025-11-29 08:00:33.793440105 +0000 UTC m=+0.119785389 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:00:33 np0005539505 nova_compute[186958]: 2025-11-29 08:00:33.878 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:36 np0005539505 nova_compute[186958]: 2025-11-29 08:00:36.996 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:38 np0005539505 podman[255912]: 2025-11-29 08:00:38.740659646 +0000 UTC m=+0.062579220 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:00:38 np0005539505 podman[255911]: 2025-11-29 08:00:38.752023078 +0000 UTC m=+0.079982263 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:00:38 np0005539505 nova_compute[186958]: 2025-11-29 08:00:38.880 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:41 np0005539505 nova_compute[186958]: 2025-11-29 08:00:41.999 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:43 np0005539505 nova_compute[186958]: 2025-11-29 08:00:43.884 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:47 np0005539505 nova_compute[186958]: 2025-11-29 08:00:47.048 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.115 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b7', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'hostId': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.116 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.144 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.requests volume: 1057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.144 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7713120-c388-4950-8a05-fc8bcc8a3c20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1057, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.116243', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83f78686-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '3c741e1960878aa27174a3d4a1a24ce4b9c0843fce527325554101555a68853e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.116243', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83f795e0-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '15693a417874c71f346d2c5052365ddf8e66bf77ca31c0157a7e27a27e9c310e'}]}, 'timestamp': '2025-11-29 08:00:48.145093', '_unique_id': 'f87bcffaaefa49cc9c65594054ae0ed4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.146 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.150 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4a55b16d-d538-42b5-8f50-87c39d63b4eb / tap4907b21d-71 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.151 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.incoming.bytes volume: 28365 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '051f317b-e762-4021-b09d-245f9f6814de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28365, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.147423', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83f8997c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': 'a55ed942b89029cb331a13e7d56ad902612ac0e0984510960374185f4d0e19b7'}]}, 'timestamp': '2025-11-29 08:00:48.151833', '_unique_id': '8c1fd289e9da42a9acf5e490bba4c85d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.152 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.154 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.154 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>]
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.171 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/cpu volume: 11940000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cec804b5-437f-484a-8e6d-c44ddc1174d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11940000000, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'timestamp': '2025-11-29T08:00:48.154552', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '83fbab1c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.811998295, 'message_signature': '67dfa741f2b1b0a14181ebbe9a14be2b53d06e1863db493b9058f42d286c88a9'}]}, 'timestamp': '2025-11-29 08:00:48.171907', '_unique_id': '41dbcc77872a47eaa7708dadd5d60b05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.172 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.173 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec8e37c2-815f-4284-b0ca-c6df22896124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.173829', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83fc0508-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '3866910e2fb8f0f10465cb18605c768fc5829fe2464e4e0286007955aa51d3ea'}]}, 'timestamp': '2025-11-29 08:00:48.174171', '_unique_id': '4303a16686ab499db6181af2deef9641'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.174 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.175 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/memory.usage volume: 46.58203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b122c846-2f3d-483d-96b4-053883fc53c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.58203125, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'timestamp': '2025-11-29T08:00:48.175725', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83fc4e28-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.811998295, 'message_signature': '8f7d17ca4cb812e00ca59ec2bf9a845dccb5a986a2504f2117eeed16a78ccebb'}]}, 'timestamp': '2025-11-29 08:00:48.176044', '_unique_id': '5a28b5fdfa76413490f878c121e3a73a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.176 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.177 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.latency volume: 198058333 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.177 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.latency volume: 24581196 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e17207e2-e6b2-4cdd-97a3-99ba51fbcf51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 198058333, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.177504', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83fc941e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': 'f7d6848fe60b4a4ae544e2990498e114a8f41447d78ccf6e3efd076d6a383172'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24581196, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.177504', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83fc9f04-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '65b69fde9ee7385b73facb4a52b9ee2f5b0744818b2f191b3a8da0d7142ef4b0'}]}, 'timestamp': '2025-11-29 08:00:48.178051', '_unique_id': '774767458b3c4730baee89e454112a87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.178 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.179 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5110de97-cb25-478a-a5d1-9271ead2d220', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.179669', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83fce72a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': 'f84165f1c2416ddcd4552bd4c2acfe8ab407148168c236f5210209a92ea2040c'}]}, 'timestamp': '2025-11-29 08:00:48.179905', '_unique_id': 'b98cecc5bddc4a6abac5ba77cddccb43'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>]
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.181 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.outgoing.bytes volume: 25004 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606dd49c-f65f-4d1a-9240-36ce53a1d477', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25004, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.181372', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83fd2992-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '0551c19bd8b52ae6e3ba2e56ad81b412eaadfac7bec7a754c910f5d0a99e4929'}]}, 'timestamp': '2025-11-29 08:00:48.181608', '_unique_id': '51fac3caec6240beb74810b6dacbf028'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.182 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc09c2e8-44a5-40ae-8dae-620e2ac9c8ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.182723', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83fd5ebc-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '7bec9210c9b6f698e395a020bef03ccc1ee2caf76bc13a75e4892e08f739157a'}]}, 'timestamp': '2025-11-29 08:00:48.182965', '_unique_id': '72ecdc7969864da5b3c68f9cab7df337'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.184 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '138bd4ee-a4f0-4888-8faa-b5a1f909e2d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.184273', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83fd9cec-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '24223832e83eadbe84eec642d35e4ec7ba413a9ab2c890b7cbfdaae6311bb345'}]}, 'timestamp': '2025-11-29 08:00:48.184580', '_unique_id': 'a1daaec5ac2748fab1fa1e5c4e331657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.185 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>]
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.196 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.196 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc89e44b-28ad-4079-a1ce-32909e3627eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.186070', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83ff7e72-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': '4a7056d561c471f96300720050a50a9bf93e39629d9db78be4ec5333d68dbe4e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.186070', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83ff88cc-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': '39b0e5f1ed0d7e3194187b268759a7772adb392d54e3956072c31e90c47439d2'}]}, 'timestamp': '2025-11-29 08:00:48.197143', '_unique_id': '9affd6b2cbd24b209e89534d0ace22c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.197 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.198 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac1ae1b5-9a5a-43da-8b3f-c5e263cedfa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.198927', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '83ffd7fa-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '01a7929389bf436f8be01f1555cfc607663e0bf8d44c40192b36fdcb5db2ca70'}]}, 'timestamp': '2025-11-29 08:00:48.199186', '_unique_id': 'a408d6ea9c6c483b913c9c1ece914dc9'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.199 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.200 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.200 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0c6ee9b-d0fb-41ef-b03c-d03ca7449915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.200480', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8400153a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': '2e0a02b610841c15292b3b5882f9f9a46d3024e4d562482d404c6ceebc180ae1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 
'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.200480', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '84001d78-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': '228302ab4ffd12b5a26186bf2550f024f1b0772b1a4eb68a729d7e26fc8ed649'}]}, 'timestamp': '2025-11-29 08:00:48.200960', '_unique_id': '523faaac9b014aa58ff68348e2615358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.201 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.202 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.incoming.packets volume: 151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88390bb8-3f13-4f36-a470-8c36992271bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 151, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.202250', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '84005a04-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '4495409da7cf0e3a83975095aa98cea177c731c384ffc55736744b7fc60fec81'}]}, 'timestamp': '2025-11-29 08:00:48.202520', '_unique_id': '5af4015590294e6aa2fc8c2972cdc9fa'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.203 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e10c5343-37ea-4c7a-a63f-6716a2e18218', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.203825', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '84009780-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '764138028bdced836703ee2964882f37050459a7c84a15d3685dcd37aa8c5e1a'}]}, 'timestamp': '2025-11-29 08:00:48.204118', '_unique_id': '0319d24d36ed4ae9807e3c724c5b4bde'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.205 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.requests volume: 332 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.205 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d2ed1b2-f5b0-49e0-94ac-11ad7bd5d316', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 332, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.205449', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8400d754-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '902ae35b16be92d48f0f9f827b8bae6c5c5f7280a2f90655b35582d01e95975c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.205449', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8400e19a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': 'a64262756c106c9c694e938ca012868965f6de5739b2834e0aa767fa80f8b57a'}]}, 'timestamp': '2025-11-29 08:00:48.205966', '_unique_id': 'bb2a974d256947b3b1c0f7ac8ef24497'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.206 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.207 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.bytes volume: 29399552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.207 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b159d79-f5e1-4b30-8fc7-a8d2c69096a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29399552, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.207399', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '84012236-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '53af5966a4090e44e6bb102eef9f8182fe2cc335611fe61f72f9902ce60f55d2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 
'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.207399', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '84012a42-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '0b259b5e2d4af2370dc3d8ee1cb00a7439737fccf82b268eb15cdcff752c13be'}]}, 'timestamp': '2025-11-29 08:00:48.207820', '_unique_id': 'fe4e6e408bd1475ebb88b300e4976732'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.208 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.209 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.209 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2a2c0a2-ebff-4cd9-bfb7-7cd889f17db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.209137', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '840166b0-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': '4023efdb09211a8e3c122d90c63a467cf1b22ad8ea589bfa60355409cc791da5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 
'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.209137', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '84016f52-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.826912547, 'message_signature': 'fdeb7417f9bd2ad93d00ef7f71ebb36b611bb82fcffbac9652f422c440367ffb'}]}, 'timestamp': '2025-11-29 08:00:48.209588', '_unique_id': 'cc9e41ba5c114d2f80963a9fe6cf5c8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208>]
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.211 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.latency volume: 3869825505 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.211 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '143cda45-e890-44c7-b00a-5afbb8d4bfbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3869825505, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.211011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8401af6c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '4cc57f1d8e34d4efc970077e47d46d8e010e1f604aea60f50357e3c8441c73ac'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.211011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8401b98a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '276a9fa7e8b2171af44d612e52ff297c870e6ecfecb2124adf61b074f35e6bb4'}]}, 'timestamp': '2025-11-29 08:00:48.211531', '_unique_id': '48f7f3538318447bab016e338df5cefb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.212 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/network.outgoing.packets volume: 175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd33a8e5f-4ea9-4f33-acad-6567405dff4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 175, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': 'instance-000000b7-4a55b16d-d538-42b5-8f50-87c39d63b4eb-tap4907b21d-71', 'timestamp': '2025-11-29T08:00:48.212721', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'tap4907b21d-71', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8b:96:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4907b21d-71'}, 'message_id': '8401f35a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.788286825, 'message_signature': '8033b765ddf53cf13c4180685e8e1c76869eba73130f779901b2992f8c892569'}]}, 'timestamp': '2025-11-29 08:00:48.212996', '_unique_id': '8cd0c305099d4e97952a9db39f13eb2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.213 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.214 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.bytes volume: 73121792 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.214 12 DEBUG ceilometer.compute.pollsters [-] 4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9133abcd-b919-48a8-80d8-d02802b14884', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73121792, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-vda', 'timestamp': '2025-11-29T08:00:48.214159', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '84022b4a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': '5af665c630c40f9fcdb4d6ad36b13b42c7f4eab38697017d514b705397ae74b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'dec30fbde18e4b2382ea2c59847d067f', 'user_name': None, 'project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'project_name': None, 'resource_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb-sda', 'timestamp': '2025-11-29T08:00:48.214159', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208', 'name': 'instance-000000b7', 'instance_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'instance_type': 'm1.nano', 'host': '21e1d83233ffd80d4b769a08c9a04f5a81f85e24e60507e51a887b8c', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '840233a6-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8695.757086512, 'message_signature': 'e5d55de05967bbf9bb6d0f63ce6abcec2410e56dc38c8b1d75098733dc0f516b'}]}, 'timestamp': '2025-11-29 08:00:48.214618', '_unique_id': 'a643b6a7e8434b1fb5664cf51e585422'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:00:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:00:48.215 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:00:48 np0005539505 nova_compute[186958]: 2025-11-29 08:00:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:48 np0005539505 nova_compute[186958]: 2025-11-29 08:00:48.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:00:48 np0005539505 nova_compute[186958]: 2025-11-29 08:00:48.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:49 np0005539505 nova_compute[186958]: 2025-11-29 08:00:49.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:50 np0005539505 nova_compute[186958]: 2025-11-29 08:00:50.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.270 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.587 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.587 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.587 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.588 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:00:52 np0005539505 ovn_controller[95143]: 2025-11-29T08:00:52Z|00858|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Nov 29 03:00:52 np0005539505 podman[255957]: 2025-11-29 08:00:52.677332668 +0000 UTC m=+0.044571081 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 03:00:52 np0005539505 podman[255956]: 2025-11-29 08:00:52.683463072 +0000 UTC m=+0.053502584 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.733 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.805 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.806 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:00:52 np0005539505 nova_compute[186958]: 2025-11-29 08:00:52.863 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.002 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.003 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5535MB free_disk=73.0421028137207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.004 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.004 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:00:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:53.190 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.190 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:53.191 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:00:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:00:53.192 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.627 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 4a55b16d-d538-42b5-8f50-87c39d63b4eb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.628 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.628 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.757 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.854 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.888 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.951 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:00:53 np0005539505 nova_compute[186958]: 2025-11-29 08:00:53.952 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:00:56 np0005539505 podman[256004]: 2025-11-29 08:00:56.719519968 +0000 UTC m=+0.051332463 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:00:56 np0005539505 nova_compute[186958]: 2025-11-29 08:00:56.952 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:00:56 np0005539505 nova_compute[186958]: 2025-11-29 08:00:56.952 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:00:56 np0005539505 nova_compute[186958]: 2025-11-29 08:00:56.952 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:00:57 np0005539505 nova_compute[186958]: 2025-11-29 08:00:57.126 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 03:00:57 np0005539505 nova_compute[186958]: 2025-11-29 08:00:57.127 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 03:00:57 np0005539505 nova_compute[186958]: 2025-11-29 08:00:57.127 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 29 03:00:57 np0005539505 nova_compute[186958]: 2025-11-29 08:00:57.127 186962 DEBUG nova.objects.instance [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4a55b16d-d538-42b5-8f50-87c39d63b4eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 03:00:57 np0005539505 nova_compute[186958]: 2025-11-29 08:00:57.273 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:00:58 np0005539505 nova_compute[186958]: 2025-11-29 08:00:58.889 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.377 186962 DEBUG nova.compute.manager [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.378 186962 DEBUG nova.compute.manager [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing instance network info cache due to event network-changed-4907b21d-7156-4b51-b9bb-e881a3f55cb8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.378 186962 DEBUG oslo_concurrency.lockutils [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.403 186962 DEBUG nova.network.neutron [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.428 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.429 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.429 186962 DEBUG oslo_concurrency.lockutils [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.429 186962 DEBUG nova.network.neutron [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Refreshing network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.430 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.431 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.432 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.467 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.467 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.467 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.467 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.468 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.483 186962 INFO nova.compute.manager [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Terminating instance#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.496 186962 DEBUG nova.compute.manager [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 03:01:00 np0005539505 kernel: tap4907b21d-71 (unregistering): left promiscuous mode
Nov 29 03:01:00 np0005539505 NetworkManager[55134]: <info>  [1764403260.5211] device (tap4907b21d-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 03:01:00 np0005539505 ovn_controller[95143]: 2025-11-29T08:01:00Z|00859|binding|INFO|Releasing lport 4907b21d-7156-4b51-b9bb-e881a3f55cb8 from this chassis (sb_readonly=0)
Nov 29 03:01:00 np0005539505 ovn_controller[95143]: 2025-11-29T08:01:00Z|00860|binding|INFO|Setting lport 4907b21d-7156-4b51-b9bb-e881a3f55cb8 down in Southbound
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.537 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 ovn_controller[95143]: 2025-11-29T08:01:00Z|00861|binding|INFO|Removing iface tap4907b21d-71 ovn-installed in OVS
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.540 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.544 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:96:04 10.100.0.10'], port_security=['fa:16:3e:8b:96:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4a55b16d-d538-42b5-8f50-87c39d63b4eb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8e45e91223b45a79dd698a82af4a2a5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6649faa1-db80-42dc-8e4b-bb2b1a7f56ec f2239602-f250-44be-ba22-c84ac97adb08', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d94b7d30-29ce-4bb4-a204-dc237d12f274, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=4907b21d-7156-4b51-b9bb-e881a3f55cb8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.545 104094 INFO neutron.agent.ovn.metadata.agent [-] Port 4907b21d-7156-4b51-b9bb-e881a3f55cb8 in datapath ff3fc050-ba7a-4fdb-b763-76384fb9149e unbound from our chassis#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.546 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff3fc050-ba7a-4fdb-b763-76384fb9149e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.548 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ecdf3614-1366-40bf-9334-c8c8510fc161]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.549 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e namespace which is not needed anymore#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.568 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Nov 29 03:01:00 np0005539505 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b7.scope: Consumed 15.406s CPU time.
Nov 29 03:01:00 np0005539505 systemd-machined[153285]: Machine qemu-89-instance-000000b7 terminated.
Nov 29 03:01:00 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [NOTICE]   (255666) : haproxy version is 2.8.14-c23fe91
Nov 29 03:01:00 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [NOTICE]   (255666) : path to executable is /usr/sbin/haproxy
Nov 29 03:01:00 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [WARNING]  (255666) : Exiting Master process...
Nov 29 03:01:00 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [ALERT]    (255666) : Current worker (255668) exited with code 143 (Terminated)
Nov 29 03:01:00 np0005539505 neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e[255662]: [WARNING]  (255666) : All workers exited. Exiting... (0)
Nov 29 03:01:00 np0005539505 systemd[1]: libpod-e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f.scope: Deactivated successfully.
Nov 29 03:01:00 np0005539505 podman[256049]: 2025-11-29 08:01:00.676676783 +0000 UTC m=+0.048807812 container died e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:01:00 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f-userdata-shm.mount: Deactivated successfully.
Nov 29 03:01:00 np0005539505 systemd[1]: var-lib-containers-storage-overlay-da951229cc36c4f754f1622a613efdd183d751e981d92e0677f348ca61732f0a-merged.mount: Deactivated successfully.
Nov 29 03:01:00 np0005539505 podman[256049]: 2025-11-29 08:01:00.724768652 +0000 UTC m=+0.096899671 container cleanup e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.726 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 systemd[1]: libpod-conmon-e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f.scope: Deactivated successfully.
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.733 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.769 186962 INFO nova.virt.libvirt.driver [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Instance destroyed successfully.#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.769 186962 DEBUG nova.objects.instance [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lazy-loading 'resources' on Instance uuid 4a55b16d-d538-42b5-8f50-87c39d63b4eb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.783 186962 DEBUG nova.virt.libvirt.vif [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T07:59:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2022058758-access_point-160666208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2022058758-ac',id=183,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA6QStI4WZvM8rjCdYdMBIewiD6P/FbGD7wLorCZFoLA7wHCNE+3A9eASDdB+YjnY2gBekoYo19AK1G+4bESS2fKDisJFlhhgBaK7LaZRlIAVhbJNU4lhjciEXZNpjSjZw==',key_name='tempest-TestSecurityGroupsBasicOps-1392443600',keypairs=<?>,launch_index=0,launched_at=2025-11-29T07:59:53Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e8e45e91223b45a79dd698a82af4a2a5',ramdisk_id='',reservation_id='r-iy9xu73f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2022058758',owner_user_name='tempest-TestSecurityGroupsBasicOps-2022058758-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T07:59:53Z,user_data=None,user_id='dec30fbde18e4b2382ea2c59847d067f',uuid=4a55b16d-d538-42b5-8f50-87c39d63b4eb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.783 186962 DEBUG nova.network.os_vif_util [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converting VIF {"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.784 186962 DEBUG nova.network.os_vif_util [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.784 186962 DEBUG os_vif [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.786 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.786 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4907b21d-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.788 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.790 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.792 186962 INFO os_vif [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:96:04,bridge_name='br-int',has_traffic_filtering=True,id=4907b21d-7156-4b51-b9bb-e881a3f55cb8,network=Network(ff3fc050-ba7a-4fdb-b763-76384fb9149e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4907b21d-71')#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.793 186962 INFO nova.virt.libvirt.driver [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Deleting instance files /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb_del#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.793 186962 INFO nova.virt.libvirt.driver [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Deletion of /var/lib/nova/instances/4a55b16d-d538-42b5-8f50-87c39d63b4eb_del complete#033[00m
Nov 29 03:01:00 np0005539505 podman[256083]: 2025-11-29 08:01:00.797100737 +0000 UTC m=+0.049055038 container remove e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.802 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[659a2ac6-786e-45a2-abeb-8817e269e677]: (4, ('Sat Nov 29 08:01:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e (e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f)\ne006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f\nSat Nov 29 08:01:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e (e006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f)\ne006daef9bc73fe53e1689e119a42089bc08f50a13cc76a939fbd077368af44f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.804 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[724c0567-7be3-40da-be7b-31087c681581]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.805 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff3fc050-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:01:00 np0005539505 kernel: tapff3fc050-b0: left promiscuous mode
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.807 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.821 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[5e001bd5-8840-4628-afb0-7835b6c724e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.836 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[89006b25-c244-468b-912d-7a616288aa87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.838 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[86144700-e025-40cf-80f9-9a4646ebf9ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.855 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e67057-aa59-4ed5-ae2a-1e02eec7db19]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 864034, 'reachable_time': 24931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256109, 'error': None, 'target': 'ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.858 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff3fc050-ba7a-4fdb-b763-76384fb9149e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 03:01:00 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:00.858 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[60212063-ee50-4514-81db-90eb1bec3add]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:01:00 np0005539505 systemd[1]: run-netns-ovnmeta\x2dff3fc050\x2dba7a\x2d4fdb\x2db763\x2d76384fb9149e.mount: Deactivated successfully.
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.873 186962 INFO nova.compute.manager [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.873 186962 DEBUG oslo.service.loopingcall [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.874 186962 DEBUG nova.compute.manager [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 03:01:00 np0005539505 nova_compute[186958]: 2025-11-29 08:01:00.874 186962 DEBUG nova.network.neutron [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.516 186962 DEBUG nova.compute.manager [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-unplugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.516 186962 DEBUG oslo_concurrency.lockutils [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.516 186962 DEBUG oslo_concurrency.lockutils [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.517 186962 DEBUG oslo_concurrency.lockutils [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.517 186962 DEBUG nova.compute.manager [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] No waiting events found dispatching network-vif-unplugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 03:01:01 np0005539505 nova_compute[186958]: 2025-11-29 08:01:01.517 186962 DEBUG nova.compute.manager [req-b0af2a38-03c4-49ce-8705-b868928c3030 req-4e34e748-b12c-4bf0-b5a8-cf38c3d173ed 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-unplugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.451 186962 DEBUG nova.network.neutron [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.469 186962 INFO nova.compute.manager [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Took 1.60 seconds to deallocate network for instance.#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.567 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.567 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.581 186962 DEBUG nova.compute.manager [req-d9a87ad4-5a43-4066-863c-77b65e7c64e6 req-790049e4-cb0e-4bf1-8496-9ea40b72532e 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-deleted-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.640 186962 DEBUG nova.compute.provider_tree [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.662 186962 DEBUG nova.scheduler.client.report [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.690 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.718 186962 INFO nova.scheduler.client.report [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Deleted allocations for instance 4a55b16d-d538-42b5-8f50-87c39d63b4eb#033[00m
Nov 29 03:01:02 np0005539505 nova_compute[186958]: 2025-11-29 08:01:02.801 186962 DEBUG oslo_concurrency.lockutils [None req-b821a03d-15da-4bdf-99c2-9b4a09b1fca2 dec30fbde18e4b2382ea2c59847d067f e8e45e91223b45a79dd698a82af4a2a5 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.152 186962 DEBUG nova.network.neutron [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updated VIF entry in instance network info cache for port 4907b21d-7156-4b51-b9bb-e881a3f55cb8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.153 186962 DEBUG nova.network.neutron [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Updating instance_info_cache with network_info: [{"id": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "address": "fa:16:3e:8b:96:04", "network": {"id": "ff3fc050-ba7a-4fdb-b763-76384fb9149e", "bridge": "br-int", "label": "tempest-network-smoke--1131964355", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e8e45e91223b45a79dd698a82af4a2a5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4907b21d-71", "ovs_interfaceid": "4907b21d-7156-4b51-b9bb-e881a3f55cb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.173 186962 DEBUG oslo_concurrency.lockutils [req-12cf4f91-5634-4062-ae55-069c0ff937f0 req-7767217f-cacd-4a86-b1e2-a7a536488c57 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-4a55b16d-d538-42b5-8f50-87c39d63b4eb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.589 186962 DEBUG nova.compute.manager [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.589 186962 DEBUG oslo_concurrency.lockutils [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.590 186962 DEBUG oslo_concurrency.lockutils [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.590 186962 DEBUG oslo_concurrency.lockutils [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "4a55b16d-d538-42b5-8f50-87c39d63b4eb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.590 186962 DEBUG nova.compute.manager [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] No waiting events found dispatching network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.590 186962 WARNING nova.compute.manager [req-b6e25f42-f9ff-4726-90fa-04c5a0aa6c98 req-6ee7c079-d4cc-455c-b057-21892485dab0 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Received unexpected event network-vif-plugged-4907b21d-7156-4b51-b9bb-e881a3f55cb8 for instance with vm_state deleted and task_state None.#033[00m
Nov 29 03:01:03 np0005539505 nova_compute[186958]: 2025-11-29 08:01:03.892 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:04 np0005539505 podman[256121]: 2025-11-29 08:01:04.73118461 +0000 UTC m=+0.060039938 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:04 np0005539505 podman[256122]: 2025-11-29 08:01:04.765585563 +0000 UTC m=+0.094430971 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 29 03:01:05 np0005539505 nova_compute[186958]: 2025-11-29 08:01:05.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:05 np0005539505 nova_compute[186958]: 2025-11-29 08:01:05.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:05 np0005539505 nova_compute[186958]: 2025-11-29 08:01:05.788 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:08 np0005539505 nova_compute[186958]: 2025-11-29 08:01:08.591 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:08 np0005539505 nova_compute[186958]: 2025-11-29 08:01:08.756 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:08 np0005539505 nova_compute[186958]: 2025-11-29 08:01:08.893 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:09 np0005539505 podman[256169]: 2025-11-29 08:01:09.735101455 +0000 UTC m=+0.066430679 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0)
Nov 29 03:01:09 np0005539505 podman[256170]: 2025-11-29 08:01:09.764705032 +0000 UTC m=+0.091143598 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:10 np0005539505 nova_compute[186958]: 2025-11-29 08:01:10.791 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:13 np0005539505 nova_compute[186958]: 2025-11-29 08:01:13.895 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:15 np0005539505 nova_compute[186958]: 2025-11-29 08:01:15.769 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403260.7676065, 4a55b16d-d538-42b5-8f50-87c39d63b4eb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 03:01:15 np0005539505 nova_compute[186958]: 2025-11-29 08:01:15.769 186962 INFO nova.compute.manager [-] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] VM Stopped (Lifecycle Event)#033[00m
Nov 29 03:01:15 np0005539505 nova_compute[186958]: 2025-11-29 08:01:15.795 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:15 np0005539505 nova_compute[186958]: 2025-11-29 08:01:15.813 186962 DEBUG nova.compute.manager [None req-a485f362-2415-4f9a-ab0a-d0fd4b7200f9 - - - - - -] [instance: 4a55b16d-d538-42b5-8f50-87c39d63b4eb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 03:01:18 np0005539505 nova_compute[186958]: 2025-11-29 08:01:18.897 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:20 np0005539505 nova_compute[186958]: 2025-11-29 08:01:20.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:23 np0005539505 podman[256207]: 2025-11-29 08:01:23.724917691 +0000 UTC m=+0.056420977 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:23 np0005539505 podman[256206]: 2025-11-29 08:01:23.729393507 +0000 UTC m=+0.066498851 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 29 03:01:23 np0005539505 nova_compute[186958]: 2025-11-29 08:01:23.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:25 np0005539505 nova_compute[186958]: 2025-11-29 08:01:25.799 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:27.551 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:27.551 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:27.551 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:27 np0005539505 podman[256251]: 2025-11-29 08:01:27.753247277 +0000 UTC m=+0.082178895 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 03:01:28 np0005539505 nova_compute[186958]: 2025-11-29 08:01:28.901 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:30 np0005539505 nova_compute[186958]: 2025-11-29 08:01:30.802 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:33 np0005539505 nova_compute[186958]: 2025-11-29 08:01:33.902 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:35 np0005539505 podman[256272]: 2025-11-29 08:01:35.735092616 +0000 UTC m=+0.071540413 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:35 np0005539505 podman[256273]: 2025-11-29 08:01:35.745426008 +0000 UTC m=+0.079166929 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 03:01:35 np0005539505 nova_compute[186958]: 2025-11-29 08:01:35.804 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:38 np0005539505 nova_compute[186958]: 2025-11-29 08:01:38.903 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:40 np0005539505 podman[256322]: 2025-11-29 08:01:40.718693156 +0000 UTC m=+0.052735421 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:01:40 np0005539505 podman[256323]: 2025-11-29 08:01:40.731259631 +0000 UTC m=+0.062327962 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 03:01:40 np0005539505 nova_compute[186958]: 2025-11-29 08:01:40.805 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:43 np0005539505 nova_compute[186958]: 2025-11-29 08:01:43.904 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:45 np0005539505 nova_compute[186958]: 2025-11-29 08:01:45.808 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.380 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.381 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.382 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.382 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.383 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.383 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.431 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Adding ephemeral_1_0706d66 into backend ephemeral images _store_ephemeral_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:100#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.457 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.458 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.458 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.459 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.459 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.459 186962 WARNING nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.459 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Removable base files: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4 /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123 /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7 /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.460 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.460 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/699710de794702bf7c50d3f51aa45a0dd64d5fc4#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.460 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/923f30c548f83d073f1130ce28fd6a6debb4b123#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.460 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d537cc6a7df5615191df4c72ff29adbe892591b7#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.460 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/e69d3f3d9d2adc72437096d883077081c1258369#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.461 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.461 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.461 186962 DEBUG nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.461 186962 INFO nova.virt.libvirt.imagecache [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ephemeral_1_0706d66#033[00m
Nov 29 03:01:48 np0005539505 nova_compute[186958]: 2025-11-29 08:01:48.905 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:50 np0005539505 nova_compute[186958]: 2025-11-29 08:01:50.462 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:50 np0005539505 nova_compute[186958]: 2025-11-29 08:01:50.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.403 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.404 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.576 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.577 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5708MB free_disk=73.07085037231445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.577 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.577 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.645 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.646 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.664 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.699 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.700 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.717 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.747 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.767 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.786 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.807 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.808 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:01:53 np0005539505 nova_compute[186958]: 2025-11-29 08:01:53.906 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:54 np0005539505 podman[256364]: 2025-11-29 08:01:54.711292348 +0000 UTC m=+0.043657876 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:01:54 np0005539505 podman[256363]: 2025-11-29 08:01:54.720983622 +0000 UTC m=+0.056645423 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc.)
Nov 29 03:01:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:55.260 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:01:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:01:55.261 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:01:55 np0005539505 nova_compute[186958]: 2025-11-29 08:01:55.261 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:55 np0005539505 nova_compute[186958]: 2025-11-29 08:01:55.808 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:55 np0005539505 nova_compute[186958]: 2025-11-29 08:01:55.853 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:56 np0005539505 nova_compute[186958]: 2025-11-29 08:01:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:56 np0005539505 nova_compute[186958]: 2025-11-29 08:01:56.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:01:56 np0005539505 nova_compute[186958]: 2025-11-29 08:01:56.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:01:56 np0005539505 nova_compute[186958]: 2025-11-29 08:01:56.396 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:01:56 np0005539505 nova_compute[186958]: 2025-11-29 08:01:56.397 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:01:58 np0005539505 podman[256405]: 2025-11-29 08:01:58.716972154 +0000 UTC m=+0.050423086 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 03:01:58 np0005539505 nova_compute[186958]: 2025-11-29 08:01:58.908 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:01:59 np0005539505 nova_compute[186958]: 2025-11-29 08:01:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:00 np0005539505 nova_compute[186958]: 2025-11-29 08:02:00.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:03 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:03.263 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:03 np0005539505 nova_compute[186958]: 2025-11-29 08:02:03.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:03 np0005539505 nova_compute[186958]: 2025-11-29 08:02:03.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:05 np0005539505 nova_compute[186958]: 2025-11-29 08:02:05.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:05 np0005539505 nova_compute[186958]: 2025-11-29 08:02:05.857 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:06 np0005539505 podman[256425]: 2025-11-29 08:02:06.742023275 +0000 UTC m=+0.072287535 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:02:06 np0005539505 podman[256424]: 2025-11-29 08:02:06.762144744 +0000 UTC m=+0.086567669 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 03:02:08 np0005539505 nova_compute[186958]: 2025-11-29 08:02:08.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:10 np0005539505 nova_compute[186958]: 2025-11-29 08:02:10.859 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:11 np0005539505 podman[256469]: 2025-11-29 08:02:11.722087095 +0000 UTC m=+0.058150115 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 29 03:02:11 np0005539505 podman[256470]: 2025-11-29 08:02:11.747345799 +0000 UTC m=+0.079982112 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:02:13 np0005539505 nova_compute[186958]: 2025-11-29 08:02:13.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:15 np0005539505 nova_compute[186958]: 2025-11-29 08:02:15.885 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.204 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.204 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.222 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.366 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.367 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.374 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.375 186962 INFO nova.compute.claims [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Claim successful on node compute-2.ctlplane.example.com#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.514 186962 DEBUG nova.compute.provider_tree [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.528 186962 DEBUG nova.scheduler.client.report [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.551 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.552 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.606 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.606 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.627 186962 INFO nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.642 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.793 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.794 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.795 186962 INFO nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Creating image(s)#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.795 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.795 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.796 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
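The three lockutils lines above show the acquire/wait/hold pattern oslo.concurrency logs around the `disk.info` file: how long the caller waited to get the lock, then how long it held it. A minimal in-process sketch of that pattern, using only the standard library (the `timed_lock` helper is illustrative, not oslo.concurrency's API, which also supports inter-process file locks):

```python
# Hypothetical analogue of the "waited N.NNNs / held N.NNNs" lock logging
# seen in the oslo_concurrency.lockutils DEBUG lines: acquire a named lock,
# report how long acquisition took, then report how long it was held.
import threading
import time
from contextlib import contextmanager

_locks = {}  # name -> threading.Lock, created on first use


@contextmanager
def timed_lock(name, log=print):
    """Acquire the named in-process lock, logging wait and hold durations."""
    lock = _locks.setdefault(name, threading.Lock())
    start = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - start
    log(f'Lock "{name}" acquired :: waited {waited:.3f}s')
    held_from = time.monotonic()
    try:
        yield lock
    finally:
        held = time.monotonic() - held_from
        lock.release()
        log(f'Lock "{name}" released :: held {held:.3f}s')
```

oslo's real implementation additionally supports external (fcntl-based) file locks, which is what protects `disk.info` against concurrent nova-compute workers; this sketch only mirrors the single-process case.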
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.809 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.872 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.873 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.874 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.886 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.945 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.946 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.983 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28,backing_fmt=raw /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
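The `qemu-img create` command that just returned 0 builds the instance's root disk as a qcow2 overlay whose backing file is the cached raw base image under `_base/`. A sketch of how that argv is assembled (the helper name is illustrative, not Nova's internal API; the paths and size are the ones from the log entry):

```python
# Illustrative assembly of the overlay-creation command recorded in the log:
# a copy-on-write qcow2 disk layered on the shared raw base image.
def qcow2_create_cmd(base_path, disk_path, size_bytes):
    """Return the argv for creating a qcow2 overlay on a raw backing image."""
    return [
        "qemu-img", "create",
        "-f", "qcow2",
        # backing_fmt must be given explicitly: recent QEMU refuses to
        # probe the backing image's format for security reasons.
        "-o", f"backing_file={base_path},backing_fmt=raw",
        disk_path,
        str(size_bytes),
    ]


cmd = qcow2_create_cmd(
    "/var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28",
    "/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk",
    1073741824,  # 1 GiB, the flavor's root_gb
)
```

Because the overlay only records blocks that diverge from the base image, many instances booted from the same Glance image share one cached base file and start with near-zero disk usage.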
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.984 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "2eaa7b927781a5e92a9ac0df18b4323517195e28" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:16 np0005539505 nova_compute[186958]: 2025-11-29 08:02:16.984 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.053 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2eaa7b927781a5e92a9ac0df18b4323517195e28 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.054 186962 DEBUG nova.virt.disk.api [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Checking if we can resize image /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.054 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.110 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.111 186962 DEBUG nova.virt.disk.api [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Cannot resize image /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
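The "Cannot resize image ... to a smaller size" line is the expected outcome of a simple guard, not an error: `qemu-img info` reports the overlay's virtual size, and Nova will only ever grow a disk in place. Here the overlay was already created at the flavor's 1 GiB root size, so there is nothing to grow. A hedged sketch of that check (function name illustrative; Nova's actual logic lives in `nova/virt/disk/api.py`):

```python
# Sketch of the grow-only guard behind the "Cannot resize image to a
# smaller size" DEBUG line: shrinking a qcow2 in place could truncate
# guest data, so only a strictly larger target size is actionable.
def can_resize_image(current_virtual_size, requested_size):
    """Return True only when the disk would actually grow."""
    return requested_size > current_virtual_size
```

With both sizes equal to 1073741824 bytes, the check returns False and the spawn simply proceeds with the disk as created.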
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.112 186962 DEBUG nova.objects.instance [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lazy-loading 'migration_context' on Instance uuid 058df251-6e51-4b9f-937f-08868563ee24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.125 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.125 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Ensure instance console log exists: /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.126 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.126 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.127 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:17 np0005539505 nova_compute[186958]: 2025-11-29 08:02:17.263 186962 DEBUG nova.policy [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 29 03:02:18 np0005539505 nova_compute[186958]: 2025-11-29 08:02:18.916 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:19 np0005539505 nova_compute[186958]: 2025-11-29 08:02:19.824 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Successfully created port: d86104b7-e43a-474d-9e35-200858602d45 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 29 03:02:20 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:20Z|00862|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 29 03:02:20 np0005539505 nova_compute[186958]: 2025-11-29 08:02:20.887 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:20 np0005539505 nova_compute[186958]: 2025-11-29 08:02:20.899 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Successfully updated port: d86104b7-e43a-474d-9e35-200858602d45 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.003 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.003 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquired lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.004 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.095 186962 DEBUG nova.compute.manager [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-changed-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.095 186962 DEBUG nova.compute.manager [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Refreshing instance network info cache due to event network-changed-d86104b7-e43a-474d-9e35-200858602d45. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.095 186962 DEBUG oslo_concurrency.lockutils [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 29 03:02:21 np0005539505 nova_compute[186958]: 2025-11-29 08:02:21.375 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.011 186962 DEBUG nova.network.neutron [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updating instance_info_cache with network_info: [{"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.071 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Releasing lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.072 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Instance network_info: |[{"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
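The `network_info` blob logged above is the cached Neutron view of the instance's single OVS/OVN port. A sketch of walking that structure to recover the externally useful facts, the MAC and fixed IP, using a copy trimmed to the keys touched here (the `fixed_ips` helper is illustrative, not a Nova API):

```python
# Trimmed copy of the network_info entry for port
# d86104b7-e43a-474d-9e35-200858602d45 as cached in the log above.
network_info = [{
    "id": "d86104b7-e43a-474d-9e35-200858602d45",
    "address": "fa:16:3e:5b:97:03",
    "devname": "tapd86104b7-e4",
    "network": {
        "bridge": "br-int",
        "subnets": [{
            "cidr": "10.100.0.0/28",
            "ips": [{"address": "10.100.0.9", "type": "fixed"}],
        }],
    },
}]


def fixed_ips(vifs):
    """Yield (mac, ip) pairs for every fixed IP on every VIF."""
    for vif in vifs:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip["type"] == "fixed":
                    yield vif["address"], ip["address"]
```

Note `"active": false` in the logged entry: the port exists in Neutron but OVN has not yet bound and activated it, which is why Nova will later wait for a `network-vif-plugged` event before declaring the instance ACTIVE.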
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.072 186962 DEBUG oslo_concurrency.lockutils [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.072 186962 DEBUG nova.network.neutron [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Refreshing network info cache for port d86104b7-e43a-474d-9e35-200858602d45 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.075 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Start _get_guest_xml network_info=[{"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'size': 0, 'boot_index': 0, 'device_type': 'disk', 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encrypted': False, 'guest_format': None, 'image_id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.079 186962 WARNING nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.083 186962 DEBUG nova.virt.libvirt.host [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.084 186962 DEBUG nova.virt.libvirt.host [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.086 186962 DEBUG nova.virt.libvirt.host [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.087 186962 DEBUG nova.virt.libvirt.host [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.088 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.088 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T06:47:00Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T06:47:02Z,direct_url=<?>,disk_format='qcow2',id=5d270706-931c-4fd1-846d-ba6ddeac2a79,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='a5c64dbe41154199a092e2cf8640d8ba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T06:47:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.088 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.089 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.089 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.089 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.089 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.089 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.090 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.090 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.090 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.090 186962 DEBUG nova.virt.hardware [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
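The `nova.virt.hardware` lines above enumerate every (sockets, cores, threads) triple whose product supplies the flavor's vCPU count, bounded by the 65536-per-dimension limits; with one vCPU and no flavor or image preferences, only 1:1:1 survives. A simplified sketch of that enumeration (brute force, not Nova's actual search order or preference sorting):

```python
# Simplified sketch of the topology search logged by
# _get_possible_cpu_topologies: enumerate (sockets, cores, threads)
# triples whose product is exactly the vCPU count, within the limits.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate guest CPU topologies providing exactly `vcpus` CPUs."""
    found = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    found.append((s, c, t))
    return found
```

For the `m1.nano` flavor's single vCPU this yields only `(1, 1, 1)`, matching the "Got 1 possible topologies" line; larger vCPU counts produce multiple candidates, which Nova then sorts against any `hw:cpu_*` preferences.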
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.094 186962 DEBUG nova.virt.libvirt.vif [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T08:02:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1731420900',display_name='tempest-TestServerBasicOps-server-1731420900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1731420900',id=185,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBeU1cc1A3oubUgmFqzVQGGjzs+lE8IJZUVZaoKpr6cfNMuO5sy18fQd7r4kC0tonosditk7SlhijtVMWQXRjXt3dWTLk3CHE2OvAH026U0SXODUaMLaWMEufDtXy84EFQ==',key_name='tempest-TestServerBasicOps-619853107',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3c11b975eeef454fa22662a762affc9c',ramdisk_id='',reservation_id='r-k0z16k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-511476965',owner_user_name='tempest-TestServerBasicOps-511476965-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T08:02:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='28d5e849ea254d80ae2b0b1654f39d29',uuid=058df251-6e51-4b9f-937f-08868563ee24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.094 186962 DEBUG nova.network.os_vif_util [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converting VIF {"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.095 186962 DEBUG nova.network.os_vif_util [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.096 186962 DEBUG nova.objects.instance [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lazy-loading 'pci_devices' on Instance uuid 058df251-6e51-4b9f-937f-08868563ee24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.113 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] End _get_guest_xml xml=<domain type="kvm">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <uuid>058df251-6e51-4b9f-937f-08868563ee24</uuid>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <name>instance-000000b9</name>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <memory>131072</memory>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <vcpu>1</vcpu>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <metadata>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:name>tempest-TestServerBasicOps-server-1731420900</nova:name>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:creationTime>2025-11-29 08:02:22</nova:creationTime>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:flavor name="m1.nano">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:memory>128</nova:memory>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:disk>1</nova:disk>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:swap>0</nova:swap>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:ephemeral>0</nova:ephemeral>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:vcpus>1</nova:vcpus>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      </nova:flavor>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:owner>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:user uuid="28d5e849ea254d80ae2b0b1654f39d29">tempest-TestServerBasicOps-511476965-project-member</nova:user>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:project uuid="3c11b975eeef454fa22662a762affc9c">tempest-TestServerBasicOps-511476965</nova:project>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      </nova:owner>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:root type="image" uuid="5d270706-931c-4fd1-846d-ba6ddeac2a79"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <nova:ports>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        <nova:port uuid="d86104b7-e43a-474d-9e35-200858602d45">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:        </nova:port>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      </nova:ports>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </nova:instance>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </metadata>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <sysinfo type="smbios">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <system>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="manufacturer">RDO</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="product">OpenStack Compute</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="serial">058df251-6e51-4b9f-937f-08868563ee24</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="uuid">058df251-6e51-4b9f-937f-08868563ee24</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <entry name="family">Virtual Machine</entry>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </system>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </sysinfo>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <os>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <boot dev="hd"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <smbios mode="sysinfo"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </os>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <features>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <acpi/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <apic/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <vmcoreinfo/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </features>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <clock offset="utc">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <timer name="pit" tickpolicy="delay"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <timer name="hpet" present="no"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </clock>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <cpu mode="custom" match="exact">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <model>Nehalem</model>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <topology sockets="1" cores="1" threads="1"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </cpu>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  <devices>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <disk type="file" device="disk">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <target dev="vda" bus="virtio"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </disk>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <disk type="file" device="cdrom">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <driver name="qemu" type="raw" cache="none"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <source file="/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.config"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <target dev="sda" bus="sata"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </disk>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <interface type="ethernet">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <mac address="fa:16:3e:5b:97:03"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <driver name="vhost" rx_queue_size="512"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <mtu size="1442"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <target dev="tapd86104b7-e4"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </interface>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <serial type="pty">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <log file="/var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/console.log" append="off"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </serial>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <video>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <model type="virtio"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </video>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <input type="tablet" bus="usb"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <rng model="virtio">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <backend model="random">/dev/urandom</backend>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </rng>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="pci" model="pcie-root-port"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <controller type="usb" index="0"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    <memballoon model="virtio">
Nov 29 03:02:22 np0005539505 nova_compute[186958]:      <stats period="10"/>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:    </memballoon>
Nov 29 03:02:22 np0005539505 nova_compute[186958]:  </devices>
Nov 29 03:02:22 np0005539505 nova_compute[186958]: </domain>
Nov 29 03:02:22 np0005539505 nova_compute[186958]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.114 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Preparing to wait for external event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.114 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.115 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.115 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.115 186962 DEBUG nova.virt.libvirt.vif [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T08:02:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1731420900',display_name='tempest-TestServerBasicOps-server-1731420900',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1731420900',id=185,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBeU1cc1A3oubUgmFqzVQGGjzs+lE8IJZUVZaoKpr6cfNMuO5sy18fQd7r4kC0tonosditk7SlhijtVMWQXRjXt3dWTLk3CHE2OvAH026U0SXODUaMLaWMEufDtXy84EFQ==',key_name='tempest-TestServerBasicOps-619853107',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3c11b975eeef454fa22662a762affc9c',ramdisk_id='',reservation_id='r-k0z16k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-511476965',owner_user_name='tempest-TestServerBasicOps-511476965-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T08:02:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='28d5e849ea254d80ae2b0b1654f39d29',uuid=058df251-6e51-4b9f-937f-08868563ee24,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.116 186962 DEBUG nova.network.os_vif_util [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converting VIF {"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.117 186962 DEBUG nova.network.os_vif_util [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.117 186962 DEBUG os_vif [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.118 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.118 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.118 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.123 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.123 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd86104b7-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.123 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd86104b7-e4, col_values=(('external_ids', {'iface-id': 'd86104b7-e43a-474d-9e35-200858602d45', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:97:03', 'vm-uuid': '058df251-6e51-4b9f-937f-08868563ee24'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.125 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:22 np0005539505 NetworkManager[55134]: <info>  [1764403342.1261] manager: (tapd86104b7-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.127 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.131 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.132 186962 INFO os_vif [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4')#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.194 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.195 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.195 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] No VIF found with MAC fa:16:3e:5b:97:03, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 29 03:02:22 np0005539505 nova_compute[186958]: 2025-11-29 08:02:22.195 186962 INFO nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Using config drive#033[00m
Nov 29 03:02:23 np0005539505 nova_compute[186958]: 2025-11-29 08:02:23.918 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.265 186962 INFO nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Creating config drive at /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.config#033[00m
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.272 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmponx_jk2d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.402 186962 DEBUG oslo_concurrency.processutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmponx_jk2d" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:24 np0005539505 kernel: tapd86104b7-e4: entered promiscuous mode
Nov 29 03:02:24 np0005539505 NetworkManager[55134]: <info>  [1764403344.4780] manager: (tapd86104b7-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/432)
Nov 29 03:02:24 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:24Z|00863|binding|INFO|Claiming lport d86104b7-e43a-474d-9e35-200858602d45 for this chassis.
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.477 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:24Z|00864|binding|INFO|d86104b7-e43a-474d-9e35-200858602d45: Claiming fa:16:3e:5b:97:03 10.100.0.9
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.482 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539505 systemd-udevd[256541]: Network interface NamePolicy= disabled on kernel command line.
Nov 29 03:02:24 np0005539505 NetworkManager[55134]: <info>  [1764403344.5354] device (tapd86104b7-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 29 03:02:24 np0005539505 NetworkManager[55134]: <info>  [1764403344.5367] device (tapd86104b7-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 29 03:02:24 np0005539505 systemd-machined[153285]: New machine qemu-90-instance-000000b9.
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.542 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:24Z|00865|binding|INFO|Setting lport d86104b7-e43a-474d-9e35-200858602d45 ovn-installed in OVS
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.553 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:24 np0005539505 systemd[1]: Started Virtual Machine qemu-90-instance-000000b9.
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.862 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403344.8614354, 058df251-6e51-4b9f-937f-08868563ee24 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 03:02:24 np0005539505 nova_compute[186958]: 2025-11-29 08:02:24.862 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] VM Started (Lifecycle Event)#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.910 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:97:03 10.100.0.9'], port_security=['fa:16:3e:5b:97:03 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '058df251-6e51-4b9f-937f-08868563ee24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a053bd92-7419-4340-b338-8c9eda334695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c11b975eeef454fa22662a762affc9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5088d4cc-c787-4b31-b84f-cd25421e032f a5148027-7b1d-4285-977d-e0cf6cfc66a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a226e0f9-4cbd-4be3-91d3-c87f616ac2b3, chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d86104b7-e43a-474d-9e35-200858602d45) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:02:24 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:24Z|00866|binding|INFO|Setting lport d86104b7-e43a-474d-9e35-200858602d45 up in Southbound
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.912 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d86104b7-e43a-474d-9e35-200858602d45 in datapath a053bd92-7419-4340-b338-8c9eda334695 bound to our chassis#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.913 104094 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a053bd92-7419-4340-b338-8c9eda334695#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.925 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8ed16a-4ff9-45b6-8d07-b9509c10418e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.927 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa053bd92-71 in ovnmeta-a053bd92-7419-4340-b338-8c9eda334695 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.928 213906 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa053bd92-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.929 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e94b71-1afa-43e8-87d1-60607257f7a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.929 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[c86b477e-ec4e-45da-9458-f821862e4bbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.942 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[27f286cb-b824-42d2-bcf4-325f6cc0503d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:24 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.968 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[cb247830-09ef-4441-8573-b16e748f1f85]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:24.999 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[90bc6b80-bb18-49c5-ba79-06c7048f887c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.004 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ab8a4c-0a07-4a49-9b1d-d4f6f626504c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 NetworkManager[55134]: <info>  [1764403345.0057] manager: (tapa053bd92-70): new Veth device (/org/freedesktop/NetworkManager/Devices/433)
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.039 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[26e587bd-be47-4fbb-8e2b-4d79c2195682]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.042 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[30bd433b-eeb2-401b-8189-f47d1dd67f48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 podman[256560]: 2025-11-29 08:02:25.043836817 +0000 UTC m=+0.061818009 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 29 03:02:25 np0005539505 podman[256562]: 2025-11-29 08:02:25.063361859 +0000 UTC m=+0.078850941 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:02:25 np0005539505 NetworkManager[55134]: <info>  [1764403345.0640] device (tapa053bd92-70): carrier: link connected
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.068 213984 DEBUG oslo.privsep.daemon [-] privsep: reply[3de5bf88-78ae-4c88-873b-8ef1590c92d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.085 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[024730f0-1160-49c5-a35b-58c057eb2fb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa053bd92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:bb:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879264, 'reachable_time': 31391, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256624, 'error': None, 'target': 'ovnmeta-a053bd92-7419-4340-b338-8c9eda334695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.100 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[93d33438-32ec-4c88-83fa-628cc7a4fa18]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:bb1e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 879264, 'tstamp': 879264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256625, 'error': None, 'target': 'ovnmeta-a053bd92-7419-4340-b338-8c9eda334695', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.116 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[d65c5520-88af-4184-bc2a-ccd3c17b7f57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa053bd92-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:bb:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 268], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879264, 'reachable_time': 31391, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256626, 'error': None, 'target': 'ovnmeta-a053bd92-7419-4340-b338-8c9eda334695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.147 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1ed463-9a3b-425b-b80b-d15fee327a95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.205 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[8298e70b-ed73-4876-9d70-1d6d26ff8883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.206 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa053bd92-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.206 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.207 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa053bd92-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.208 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:25 np0005539505 kernel: tapa053bd92-70: entered promiscuous mode
Nov 29 03:02:25 np0005539505 NetworkManager[55134]: <info>  [1764403345.2091] manager: (tapa053bd92-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.210 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.211 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa053bd92-70, col_values=(('external_ids', {'iface-id': '4a52cd36-ebba-480f-9fd2-2a597179d0ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.212 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:25 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:25Z|00867|binding|INFO|Releasing lport 4a52cd36-ebba-480f-9fd2-2a597179d0ee from this chassis (sb_readonly=1)
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.223 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.224 104094 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a053bd92-7419-4340-b338-8c9eda334695.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a053bd92-7419-4340-b338-8c9eda334695.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.225 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[909ba226-9e7e-4534-a67f-c0cc7b37bed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.225 104094 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: global
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    log         /dev/log local0 debug
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    log-tag     haproxy-metadata-proxy-a053bd92-7419-4340-b338-8c9eda334695
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    user        root
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    group       root
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    maxconn     1024
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    pidfile     /var/lib/neutron/external/pids/a053bd92-7419-4340-b338-8c9eda334695.pid.haproxy
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    daemon
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: defaults
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    log global
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    mode http
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    option httplog
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    option dontlognull
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    option http-server-close
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    option forwardfor
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    retries                 3
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-request    30s
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    timeout connect         30s
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    timeout client          32s
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    timeout server          32s
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    timeout http-keep-alive 30s
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: listen listener
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    bind 169.254.169.254:80
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    server metadata /var/lib/neutron/metadata_proxy
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]:    http-request add-header X-OVN-Network-ID a053bd92-7419-4340-b338-8c9eda334695
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 29 03:02:25 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:25.226 104094 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a053bd92-7419-4340-b338-8c9eda334695', 'env', 'PROCESS_TAG=haproxy-a053bd92-7419-4340-b338-8c9eda334695', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a053bd92-7419-4340-b338-8c9eda334695.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.299 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.304 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403344.8616748, 058df251-6e51-4b9f-937f-08868563ee24 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 29 03:02:25 np0005539505 nova_compute[186958]: 2025-11-29 08:02:25.305 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] VM Paused (Lifecycle Event)#033[00m
Nov 29 03:02:25 np0005539505 podman[256658]: 2025-11-29 08:02:25.587603463 +0000 UTC m=+0.047515265 container create 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 03:02:25 np0005539505 systemd[1]: Started libpod-conmon-0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af.scope.
Nov 29 03:02:25 np0005539505 systemd[1]: Started libcrun container.
Nov 29 03:02:25 np0005539505 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b98f05982ab35dbaef72ec7a12697b8ad3e629093a2d25f39a3145fd704462ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 29 03:02:25 np0005539505 podman[256658]: 2025-11-29 08:02:25.561817574 +0000 UTC m=+0.021729406 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 29 03:02:25 np0005539505 podman[256658]: 2025-11-29 08:02:25.73179122 +0000 UTC m=+0.191703082 container init 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:02:25 np0005539505 podman[256658]: 2025-11-29 08:02:25.737560263 +0000 UTC m=+0.197472085 container start 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 29 03:02:25 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [NOTICE]   (256677) : New worker (256679) forked
Nov 29 03:02:25 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [NOTICE]   (256677) : Loading success.
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.250 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.255 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.316 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.435 186962 DEBUG nova.network.neutron [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updated VIF entry in instance network info cache for port d86104b7-e43a-474d-9e35-200858602d45. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.436 186962 DEBUG nova.network.neutron [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updating instance_info_cache with network_info: [{"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.516 186962 DEBUG oslo_concurrency.lockutils [req-5284017d-a79e-46fe-a3c8-4dbe423adea8 req-c9fc547c-4e55-46ad-bd99-b7701ad4df3c 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.647 186962 DEBUG nova.compute.manager [req-1afa92c5-6225-423d-85e5-cc97750e91d4 req-2a80f57a-038b-442e-862e-bae5eba0d46d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.647 186962 DEBUG oslo_concurrency.lockutils [req-1afa92c5-6225-423d-85e5-cc97750e91d4 req-2a80f57a-038b-442e-862e-bae5eba0d46d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.648 186962 DEBUG oslo_concurrency.lockutils [req-1afa92c5-6225-423d-85e5-cc97750e91d4 req-2a80f57a-038b-442e-862e-bae5eba0d46d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.648 186962 DEBUG oslo_concurrency.lockutils [req-1afa92c5-6225-423d-85e5-cc97750e91d4 req-2a80f57a-038b-442e-862e-bae5eba0d46d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.648 186962 DEBUG nova.compute.manager [req-1afa92c5-6225-423d-85e5-cc97750e91d4 req-2a80f57a-038b-442e-862e-bae5eba0d46d 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Processing event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.649 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.653 186962 DEBUG nova.virt.driver [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] Emitting event <LifecycleEvent: 1764403346.653036, 058df251-6e51-4b9f-937f-08868563ee24 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.653 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] VM Resumed (Lifecycle Event)
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.655 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.658 186962 INFO nova.virt.libvirt.driver [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Instance spawned successfully.
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.659 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.674 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.678 186962 DEBUG nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.682 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.682 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.683 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.683 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.683 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.684 186962 DEBUG nova.virt.libvirt.driver [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.707 186962 INFO nova.compute.manager [None req-88f2d9af-2b4a-4a3d-be99-88de56ac7c92 - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.870 186962 INFO nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Took 10.08 seconds to spawn the instance on the hypervisor.
Nov 29 03:02:26 np0005539505 nova_compute[186958]: 2025-11-29 08:02:26.870 186962 DEBUG nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 03:02:27 np0005539505 nova_compute[186958]: 2025-11-29 08:02:27.115 186962 INFO nova.compute.manager [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Took 10.79 seconds to build instance.
Nov 29 03:02:27 np0005539505 nova_compute[186958]: 2025-11-29 08:02:27.127 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:27 np0005539505 nova_compute[186958]: 2025-11-29 08:02:27.131 186962 DEBUG oslo_concurrency.lockutils [None req-80e14d52-7f68-4f8a-ba43-fcfd3a1dc159 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:02:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:27.552 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:02:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:27.553 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:02:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:27.553 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.819 186962 DEBUG nova.compute.manager [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.820 186962 DEBUG oslo_concurrency.lockutils [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.821 186962 DEBUG oslo_concurrency.lockutils [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.821 186962 DEBUG oslo_concurrency.lockutils [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.821 186962 DEBUG nova.compute.manager [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] No waiting events found dispatching network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.822 186962 WARNING nova.compute.manager [req-0992eff9-e9ce-4a76-a9e5-d1d003eee73c req-c189d7a7-5858-4110-82c3-7236c8ae4b2b 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received unexpected event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 for instance with vm_state active and task_state None.
Nov 29 03:02:28 np0005539505 nova_compute[186958]: 2025-11-29 08:02:28.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:29 np0005539505 podman[256688]: 2025-11-29 08:02:29.716895216 +0000 UTC m=+0.052167076 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 29 03:02:29 np0005539505 NetworkManager[55134]: <info>  [1764403349.7812] manager: (patch-br-int-to-provnet-896b5abb-5247-4344-8023-26d3ba67647f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/435)
Nov 29 03:02:29 np0005539505 NetworkManager[55134]: <info>  [1764403349.7821] manager: (patch-provnet-896b5abb-5247-4344-8023-26d3ba67647f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/436)
Nov 29 03:02:29 np0005539505 nova_compute[186958]: 2025-11-29 08:02:29.780 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:29 np0005539505 nova_compute[186958]: 2025-11-29 08:02:29.930 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:29 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:29Z|00868|binding|INFO|Releasing lport 4a52cd36-ebba-480f-9fd2-2a597179d0ee from this chassis (sb_readonly=0)
Nov 29 03:02:29 np0005539505 nova_compute[186958]: 2025-11-29 08:02:29.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:30 np0005539505 nova_compute[186958]: 2025-11-29 08:02:30.721 186962 DEBUG nova.compute.manager [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-changed-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 29 03:02:30 np0005539505 nova_compute[186958]: 2025-11-29 08:02:30.722 186962 DEBUG nova.compute.manager [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Refreshing instance network info cache due to event network-changed-d86104b7-e43a-474d-9e35-200858602d45. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 29 03:02:30 np0005539505 nova_compute[186958]: 2025-11-29 08:02:30.722 186962 DEBUG oslo_concurrency.lockutils [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 29 03:02:30 np0005539505 nova_compute[186958]: 2025-11-29 08:02:30.722 186962 DEBUG oslo_concurrency.lockutils [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquired lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 29 03:02:30 np0005539505 nova_compute[186958]: 2025-11-29 08:02:30.723 186962 DEBUG nova.network.neutron [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Refreshing network info cache for port d86104b7-e43a-474d-9e35-200858602d45 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 29 03:02:32 np0005539505 nova_compute[186958]: 2025-11-29 08:02:32.131 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:32 np0005539505 nova_compute[186958]: 2025-11-29 08:02:32.430 186962 DEBUG nova.network.neutron [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updated VIF entry in instance network info cache for port d86104b7-e43a-474d-9e35-200858602d45. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 29 03:02:32 np0005539505 nova_compute[186958]: 2025-11-29 08:02:32.431 186962 DEBUG nova.network.neutron [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updating instance_info_cache with network_info: [{"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 29 03:02:32 np0005539505 nova_compute[186958]: 2025-11-29 08:02:32.626 186962 DEBUG oslo_concurrency.lockutils [req-8497d0d4-32f0-4d6c-aab7-7adb05526a98 req-696b947f-8d88-4d1f-a8ae-9b2964a83db8 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Releasing lock "refresh_cache-058df251-6e51-4b9f-937f-08868563ee24" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 29 03:02:33 np0005539505 nova_compute[186958]: 2025-11-29 08:02:33.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:37 np0005539505 nova_compute[186958]: 2025-11-29 08:02:37.182 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:37 np0005539505 podman[256708]: 2025-11-29 08:02:37.738343034 +0000 UTC m=+0.061913842 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 29 03:02:37 np0005539505 podman[256709]: 2025-11-29 08:02:37.768055004 +0000 UTC m=+0.087335411 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:02:38 np0005539505 nova_compute[186958]: 2025-11-29 08:02:38.924 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:39 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:39Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:97:03 10.100.0.9
Nov 29 03:02:39 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:39Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:97:03 10.100.0.9
Nov 29 03:02:42 np0005539505 nova_compute[186958]: 2025-11-29 08:02:42.202 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:42 np0005539505 podman[256776]: 2025-11-29 08:02:42.72011936 +0000 UTC m=+0.053449722 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:02:42 np0005539505 podman[256777]: 2025-11-29 08:02:42.727270613 +0000 UTC m=+0.054438241 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:02:43 np0005539505 nova_compute[186958]: 2025-11-29 08:02:43.927 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:47 np0005539505 nova_compute[186958]: 2025-11-29 08:02:47.206 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.115 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '058df251-6e51-4b9f-937f-08868563ee24', 'name': 'tempest-TestServerBasicOps-server-1731420900', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b9', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3c11b975eeef454fa22662a762affc9c', 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'hostId': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.128 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.128 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6167c774-2b84-4f81-aa8e-a5d49ce95636', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.115808', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb7ba8de-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': 'f49fd31e2bda880b1a7fc225bb67f30309e3fe7117394b916cb172fd233d48ff'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512000, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.115808', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb7bb860-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': 'd1f098683b6fb8c8b18ea0b7fc4c30f7f329cfc0581ddd2bcb3d00abc992c65b'}]}, 'timestamp': '2025-11-29 08:02:48.129324', '_unique_id': '62f4a99bb62a461aaf4653fd744f54da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.130 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.154 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.155 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c48f1230-4eb6-4a0f-a0a1-159035534542', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.131567', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb7fa290-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '5c95e7c763b1a958b03ba714bc255bda14b6e6e844fd6340a7e4001440429e7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 120, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.131567', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb7fb0a0-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '8e3e78c069e637486a02d24a2c683a1c99441ceb49dc54406b1d654c1ee36b7d'}]}, 'timestamp': '2025-11-29 08:02:48.155293', '_unique_id': '8a5c2709d1444cd1b3c392006ec11d7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.159 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 058df251-6e51-4b9f-937f-08868563ee24 / tapd86104b7-e4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.159 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9eff1e6-bb37-49fc-9f32-be17c5280e3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.156933', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb805f96-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': 'ba5e98a2efa0667f9426982336fb1251c9dbe7eb86bb2089b315abc39eda8a6a'}]}, 'timestamp': '2025-11-29 08:02:48.159765', '_unique_id': 'fc2a2760999d4e2fb9089a898b6d2e03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.175 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/memory.usage volume: 42.7421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be4cb9bf-a0da-47ef-a765-fed3bbadc789', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7421875, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24', 'timestamp': '2025-11-29T08:02:48.160940', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'cb82f3e6-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.816602172, 'message_signature': '17c50d3577957219d72d619a9ab1af3448f44a6990bc0f957a848f5083253d88'}]}, 'timestamp': '2025-11-29 08:02:48.176718', '_unique_id': 'b3c393938b924c4fa5629ad261df1a9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.177 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.178 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.178 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.178 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d425d5-4af6-4507-8c9c-dcf0057d6e9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.178609', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb834a08-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '92c5f8821900334706f480f3bf2970fffa15a1b144c48c68a079e01f0399c9c5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 299326, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.178609', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb835674-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': 'ab5d959242766d63710c435f77a1b6c1323bbab11abd7bd770af00e750d48746'}]}, 'timestamp': '2025-11-29 08:02:48.179199', '_unique_id': 'a01587663b4b4be99952e851248aa729'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.179 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.180 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.180 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>]
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.outgoing.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a7e255-35bd-4474-98cb-45bff1c055e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.181108', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb83ac64-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': 'd72ba0f5cf599e0a5875650300915e3b19ca3c0753e9230eba6b3e9410d2e293'}]}, 'timestamp': '2025-11-29 08:02:48.181390', '_unique_id': '293be12f58d34f58a2e3f51e303064d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.181 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.182 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.bytes volume: 72916992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.182 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c277e3a3-7254-4c6f-b425-14ec1cbedc76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72916992, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.182620', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb83e63e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': 'ce8bcc7e60bbcd70dd2245ed3f7566b949c45c667f08c84254854788e33858f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.182620', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb83ef8a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': 'd8a0fcdf0c6289e8b4a242c06d0a79b2aaa3a848fcbbaf93734764f5297b003a'}]}, 'timestamp': '2025-11-29 08:02:48.183072', '_unique_id': '8fd21d5d3a174f419acbb141f884fe85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.183 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba95d518-2afa-40a5-9ae4-80692eff907a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.184195', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb8425b8-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '175e1bffe478e637c11ab140bb15a9e60c3afba094f470c307863571785cb6c3'}]}, 'timestamp': '2025-11-29 08:02:48.184476', '_unique_id': '9913b7a4807d4a41a9aaf38becd1bd9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.184 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.185 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/cpu volume: 11790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf800e3b-b525-4d17-a527-e04acce700f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11790000000, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24', 'timestamp': '2025-11-29T08:02:48.185521', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cb845768-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.816602172, 'message_signature': '0525619aabead8fa8d3784016cb2c66065a9e768929e332df9baf2ce8338ba3d'}]}, 'timestamp': '2025-11-29 08:02:48.185739', '_unique_id': '96010a8e52b64f1ca6473674215e19db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.186 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e000d4ad-5871-4b14-9aac-b0a0310f9063', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.186856', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb848c56-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '8be7cb9534c3b66974a7a80fa17f4addcffa8e4659b90401ac706073bfca19d1'}]}, 'timestamp': '2025-11-29 08:02:48.187105', '_unique_id': '65d581fd251548aabf250c3f392376b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.187 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>]
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.188 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c939810-cadb-41af-9e51-99ad277f9351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.188689', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb84d40e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': '3ebff84aa0e1b9ab89bfa2ac67d42b7eba07f3e544c61b44f995e991b8cc6eb2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.188689', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb84dd82-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': 'dabaca198e0232c86e2dcf2b756eba8b4ad241e87cac5b2c1a2ef0c89d80d5a6'}]}, 'timestamp': '2025-11-29 08:02:48.189176', '_unique_id': '975c7af9882243bcbf8e3670c09f6c4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.189 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.190 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.190 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>]
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.190 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.latency volume: 4077501784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db5175a0-7905-4e49-9dba-ac1043708933', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4077501784, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.190808', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb8528b4-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '5ae5cf5767101efea1d9889ea1e5c395382b6ccf67ab1054371cc6baefd6aa59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.190808', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb85344e-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': 'f7d74e0798bde15c8de94d91c360789b6d3cf18916fb41440d5767e8dde7707b'}]}, 'timestamp': '2025-11-29 08:02:48.191424', '_unique_id': 'a207745cf45f46418be3ca51ccf28ec8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.191 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.192 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.requests volume: 286 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f2daf1-d2d0-47d5-a330-0c1720c6bbb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 286, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.192845', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb8576fc-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '971c00174302b7f6197d6333ca7d3e2ee9f17058157aa3a5439aeff76b9f3046'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.192845', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb858368-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '76bb2dcc10c3f2d4f1e54a17781aca0e7c5b5813a2544999b616ad4a65e7a163'}]}, 'timestamp': '2025-11-29 08:02:48.193427', '_unique_id': '87d4d4b3a01d49f6bedadc597858684d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.193 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.194 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.outgoing.bytes volume: 3250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5988f9d-3da8-4701-a389-998fd83ecc7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3250, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.194623', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb85bb3a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '4fd64dd92836ddc346cf04b54d47cf17a15f06cf1dc0bc8890b7292eb1ee476a'}]}, 'timestamp': '2025-11-29 08:02:48.194887', '_unique_id': '7fc58a5770a2439b9fc5175be446248f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cd39feb-661f-47e4-922b-06acbe93b1c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4195, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.196036', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb85f28a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '25c8f09e3871bf5c1f15f84e2487bff77c57271939f87b4cba26c72458a4349b'}]}, 'timestamp': '2025-11-29 08:02:48.196332', '_unique_id': '8b984113e86e4aa9b470ad008fde52f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.196 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.197 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a12a680-9d33-46ef-9329-3252da43e850', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.197516', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb862fca-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '0a0adf8e0b1654435707d7d8cb7657f125075843218980fb82a64a7e35168384'}]}, 'timestamp': '2025-11-29 08:02:48.197854', '_unique_id': 'de34895643f24909a67082efb994391f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.198 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.199 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28a9d984-f9f5-4293-8a6c-b5f26fd91dd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.199108', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb866c6a-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': '129ee139364d84656cc44cfdc9d20c3389047bcaac967fbe69037c9f3a2b7a42'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 509952, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.199108', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb867782-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.756641708, 'message_signature': '919cd9084c7903cb0f1da35caa6aeb5de2bb7a5a154f48b02bdf3f29a639d823'}]}, 'timestamp': '2025-11-29 08:02:48.199695', '_unique_id': '1d149422d7364a788be9fcd4428bfa07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.200 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.201 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0085e597-6ddc-4ca8-a190-2d191285c47c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.201159', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb86bcc4-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '99cf928bcae51f00161f4cd469291aaf76a117231b32f1717bb108122daead26'}]}, 'timestamp': '2025-11-29 08:02:48.201486', '_unique_id': '6991cd6262d240a2bc063a3319adf7c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.202 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestServerBasicOps-server-1731420900>]
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.203 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87f2c359-5a75-4c99-a89e-58864c7781ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.203187', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb870bac-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': 'fea0f6af358bf04d18fb3ba69ee3fd327cfa609e00caddd5afe3510e2e5dd2d7'}]}, 'timestamp': '2025-11-29 08:02:48.203501', '_unique_id': 'd71f0c38fc844990a8399dfd1a16c2a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8d04335-13ba-45ff-a230-920b7117f50a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': 'instance-000000b9-058df251-6e51-4b9f-937f-08868563ee24-tapd86104b7-e4', 'timestamp': '2025-11-29T08:02:48.205071', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'tapd86104b7-e4', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:97:03', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd86104b7-e4'}, 'message_id': 'cb87556c-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.797795501, 'message_signature': '5230aa06436f7f4e02b7a60a02609f1ddd17d7151d32f33efaa83657ccb0dee9'}]}, 'timestamp': '2025-11-29 08:02:48.205406', '_unique_id': 'd9781f7cd9184e91a38ddb05adb3b8e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.205 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.206 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.latency volume: 203575601 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 DEBUG ceilometer.compute.pollsters [-] 058df251-6e51-4b9f-937f-08868563ee24/disk.device.read.latency volume: 22083351 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '448cd044-017f-4777-a526-26cdf7181f58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 203575601, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-vda', 'timestamp': '2025-11-29T08:02:48.206784', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb879914-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': 'e00482839d2fadc5f56e32e8777305333e62ed0f284023dda5ad4e321e6c0fe6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 22083351, 'user_id': '28d5e849ea254d80ae2b0b1654f39d29', 'user_name': None, 'project_id': '3c11b975eeef454fa22662a762affc9c', 'project_name': None, 'resource_id': '058df251-6e51-4b9f-937f-08868563ee24-sda', 'timestamp': '2025-11-29T08:02:48.206784', 'resource_metadata': {'display_name': 'tempest-TestServerBasicOps-server-1731420900', 'name': 'instance-000000b9', 'instance_id': '058df251-6e51-4b9f-937f-08868563ee24', 'instance_type': 'm1.nano', 'host': '02a6402d38a56eb51f4ea71e57166a049ee4d52e4c1b56cddaec11ee', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '1f2aa7c9-ac89-4e57-88a4-cb4eaefc2f31', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '5d270706-931c-4fd1-846d-ba6ddeac2a79'}, 'image_ref': '5d270706-931c-4fd1-846d-ba6ddeac2a79', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb87a1fc-ccf9-11f0-8954-fa163e5a5606', 'monotonic_time': 8815.772432223, 'message_signature': '943e8dc600c31de2cf8e165caf82ba2349af1e8577734784ff23c486461d9659'}]}, 'timestamp': '2025-11-29 08:02:48.207372', '_unique_id': '64d26ff9aaf04a55a7464f9139aca00b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 29 03:02:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:02:48.207 12 ERROR oslo_messaging.notify.messaging 
Nov 29 03:02:48 np0005539505 nova_compute[186958]: 2025-11-29 08:02:48.928 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:49 np0005539505 nova_compute[186958]: 2025-11-29 08:02:49.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:49 np0005539505 nova_compute[186958]: 2025-11-29 08:02:49.395 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:49 np0005539505 nova_compute[186958]: 2025-11-29 08:02:49.395 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:49.518 104201 DEBUG eventlet.wsgi.server [-] (104201) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:49.520 104201 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: Accept: */*#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: Connection: close#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: Content-Type: text/plain#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: Host: 169.254.169.254#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: User-Agent: curl/7.84.0#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: X-Forwarded-For: 10.100.0.9#015
Nov 29 03:02:49 np0005539505 ovn_metadata_agent[104089]: X-Ovn-Network-Id: a053bd92-7419-4340-b338-8c9eda334695 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:50.817 104201 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:50.817 104201 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.2974823#033[00m
Nov 29 03:02:50 np0005539505 haproxy-metadata-proxy-a053bd92-7419-4340-b338-8c9eda334695[256679]: 10.100.0.9:56742 [29/Nov/2025:08:02:49.517] listener listener/metadata 0/0/0/1300/1300 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:50.937 104201 DEBUG eventlet.wsgi.server [-] (104201) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:50.938 104201 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: Accept: */*#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: Connection: close#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: Content-Length: 100#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: Content-Type: application/x-www-form-urlencoded#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: Host: 169.254.169.254#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: User-Agent: curl/7.84.0#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: X-Forwarded-For: 10.100.0.9#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: X-Ovn-Network-Id: a053bd92-7419-4340-b338-8c9eda334695#015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: #015
Nov 29 03:02:50 np0005539505 ovn_metadata_agent[104089]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 29 03:02:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:51.315 104201 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 29 03:02:51 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:51.315 104201 INFO eventlet.wsgi.server [-] 10.100.0.9,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.3774731#033[00m
Nov 29 03:02:51 np0005539505 haproxy-metadata-proxy-a053bd92-7419-4340-b338-8c9eda334695[256679]: 10.100.0.9:34318 [29/Nov/2025:08:02:50.936] listener listener/metadata 0/0/0/379/379 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Nov 29 03:02:52 np0005539505 nova_compute[186958]: 2025-11-29 08:02:52.209 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:52 np0005539505 nova_compute[186958]: 2025-11-29 08:02:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.405 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.405 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.405 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.406 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.406 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.407 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.407 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.425 186962 INFO nova.compute.manager [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Terminating instance#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.438 186962 DEBUG nova.compute.manager [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 29 03:02:53 np0005539505 kernel: tapd86104b7-e4 (unregistering): left promiscuous mode
Nov 29 03:02:53 np0005539505 NetworkManager[55134]: <info>  [1764403373.4649] device (tapd86104b7-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 29 03:02:53 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:53Z|00869|binding|INFO|Releasing lport d86104b7-e43a-474d-9e35-200858602d45 from this chassis (sb_readonly=0)
Nov 29 03:02:53 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:53Z|00870|binding|INFO|Setting lport d86104b7-e43a-474d-9e35-200858602d45 down in Southbound
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.472 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 ovn_controller[95143]: 2025-11-29T08:02:53Z|00871|binding|INFO|Removing iface tapd86104b7-e4 ovn-installed in OVS
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.475 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:53.481 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:97:03 10.100.0.9'], port_security=['fa:16:3e:5b:97:03 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '058df251-6e51-4b9f-937f-08868563ee24', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a053bd92-7419-4340-b338-8c9eda334695', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c11b975eeef454fa22662a762affc9c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5088d4cc-c787-4b31-b84f-cd25421e032f a5148027-7b1d-4285-977d-e0cf6cfc66a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a226e0f9-4cbd-4be3-91d3-c87f616ac2b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>], logical_port=d86104b7-e43a-474d-9e35-200858602d45) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6389ad36a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:02:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:53.482 104094 INFO neutron.agent.ovn.metadata.agent [-] Port d86104b7-e43a-474d-9e35-200858602d45 in datapath a053bd92-7419-4340-b338-8c9eda334695 unbound from our chassis#033[00m
Nov 29 03:02:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:53.484 104094 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a053bd92-7419-4340-b338-8c9eda334695, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 29 03:02:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:53.485 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[a922f157-82ca-4a5a-84d8-532e8833f237]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:53 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:53.486 104094 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a053bd92-7419-4340-b338-8c9eda334695 namespace which is not needed anymore#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.490 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Nov 29 03:02:53 np0005539505 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000b9.scope: Consumed 13.776s CPU time.
Nov 29 03:02:53 np0005539505 systemd-machined[153285]: Machine qemu-90-instance-000000b9 terminated.
Nov 29 03:02:53 np0005539505 kernel: tapd86104b7-e4: entered promiscuous mode
Nov 29 03:02:53 np0005539505 kernel: tapd86104b7-e4 (unregistering): left promiscuous mode
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.666 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.708 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.727 186962 INFO nova.virt.libvirt.driver [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Instance destroyed successfully.#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.728 186962 DEBUG nova.objects.instance [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lazy-loading 'resources' on Instance uuid 058df251-6e51-4b9f-937f-08868563ee24 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.756 186962 DEBUG nova.virt.libvirt.vif [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T08:02:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-1731420900',display_name='tempest-TestServerBasicOps-server-1731420900',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testserverbasicops-server-1731420900',id=185,image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBeU1cc1A3oubUgmFqzVQGGjzs+lE8IJZUVZaoKpr6cfNMuO5sy18fQd7r4kC0tonosditk7SlhijtVMWQXRjXt3dWTLk3CHE2OvAH026U0SXODUaMLaWMEufDtXy84EFQ==',key_name='tempest-TestServerBasicOps-619853107',keypairs=<?>,launch_index=0,launched_at=2025-11-29T08:02:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3c11b975eeef454fa22662a762affc9c',ramdisk_id='',reservation_id='r-k0z16k4y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='5d270706-931c-4fd1-846d-ba6ddeac2a79',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-511476965',owner_user_name='tempest-TestServerBasicOps-511476965-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T08:02:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='28d5e849ea254d80ae2b0b1654f39d29',uuid=058df251-6e51-4b9f-937f-08868563ee24,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": 
"fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.756 186962 DEBUG nova.network.os_vif_util [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converting VIF {"id": "d86104b7-e43a-474d-9e35-200858602d45", "address": "fa:16:3e:5b:97:03", "network": {"id": "a053bd92-7419-4340-b338-8c9eda334695", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1014759778-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3c11b975eeef454fa22662a762affc9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd86104b7-e4", "ovs_interfaceid": "d86104b7-e43a-474d-9e35-200858602d45", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.757 186962 DEBUG nova.network.os_vif_util [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.758 186962 DEBUG os_vif [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.759 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.760 186962 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd86104b7-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.764 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.767 186962 DEBUG oslo_concurrency.processutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.768 186962 INFO os_vif [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:97:03,bridge_name='br-int',has_traffic_filtering=True,id=d86104b7-e43a-474d-9e35-200858602d45,network=Network(a053bd92-7419-4340-b338-8c9eda334695),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd86104b7-e4')#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.768 186962 INFO nova.virt.libvirt.driver [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Deleting instance files /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24_del#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.769 186962 INFO nova.virt.libvirt.driver [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Deletion of /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24_del complete#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.773 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-000000b9, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/058df251-6e51-4b9f-937f-08868563ee24/disk#033[00m
Nov 29 03:02:53 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [NOTICE]   (256677) : haproxy version is 2.8.14-c23fe91
Nov 29 03:02:53 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [NOTICE]   (256677) : path to executable is /usr/sbin/haproxy
Nov 29 03:02:53 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [WARNING]  (256677) : Exiting Master process...
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.931 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:53 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [ALERT]    (256677) : Current worker (256679) exited with code 143 (Terminated)
Nov 29 03:02:53 np0005539505 neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695[256673]: [WARNING]  (256677) : All workers exited. Exiting... (0)
Nov 29 03:02:53 np0005539505 systemd[1]: libpod-0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af.scope: Deactivated successfully.
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.937 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.938 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5525MB free_disk=73.04219818115234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.938 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:53 np0005539505 nova_compute[186958]: 2025-11-29 08:02:53.938 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:53 np0005539505 podman[256838]: 2025-11-29 08:02:53.943201051 +0000 UTC m=+0.376558479 container died 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:02:54 np0005539505 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af-userdata-shm.mount: Deactivated successfully.
Nov 29 03:02:54 np0005539505 systemd[1]: var-lib-containers-storage-overlay-b98f05982ab35dbaef72ec7a12697b8ad3e629093a2d25f39a3145fd704462ee-merged.mount: Deactivated successfully.
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.115 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Instance 058df251-6e51-4b9f-937f-08868563ee24 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.116 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.116 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.120 186962 INFO nova.compute.manager [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.121 186962 DEBUG oslo.service.loopingcall [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.121 186962 DEBUG nova.compute.manager [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.121 186962 DEBUG nova.network.neutron [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.173 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.193 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.229 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.229 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:54 np0005539505 podman[256838]: 2025-11-29 08:02:54.404277819 +0000 UTC m=+0.837635237 container cleanup 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 29 03:02:54 np0005539505 systemd[1]: libpod-conmon-0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af.scope: Deactivated successfully.
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.823 186962 DEBUG nova.compute.manager [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-unplugged-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.823 186962 DEBUG oslo_concurrency.lockutils [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.824 186962 DEBUG oslo_concurrency.lockutils [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.824 186962 DEBUG oslo_concurrency.lockutils [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.824 186962 DEBUG nova.compute.manager [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] No waiting events found dispatching network-vif-unplugged-d86104b7-e43a-474d-9e35-200858602d45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 29 03:02:54 np0005539505 nova_compute[186958]: 2025-11-29 08:02:54.824 186962 DEBUG nova.compute.manager [req-585a07d6-216e-4973-8701-83d9fc0e9aaf req-40769be0-c0d5-4b80-a561-bf6baa99a045 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-unplugged-d86104b7-e43a-474d-9e35-200858602d45 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 29 03:02:55 np0005539505 podman[256888]: 2025-11-29 08:02:55.538046108 +0000 UTC m=+1.114749562 container remove 0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.544 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbd5ee9-7192-49ad-a849-85e20c96571a]: (4, ('Sat Nov 29 08:02:53 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695 (0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af)\n0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af\nSat Nov 29 08:02:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a053bd92-7419-4340-b338-8c9eda334695 (0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af)\n0d1d74664189fbb5ec09c6c4c7f9f1632f9452fc170094e794ce6d0d635352af\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.545 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[97d6377a-e868-4f8d-aac8-d1413b0fdb34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.546 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa053bd92-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.548 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:55 np0005539505 kernel: tapa053bd92-70: left promiscuous mode
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.559 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.560 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.563 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4f676761-43de-46de-abf2-5425c097d0c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.583 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[fa64be52-a01a-4294-806d-de9697263c63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.585 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[4c785fc0-5f27-4b08-b74a-177aa067ab5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.603 213906 DEBUG oslo.privsep.daemon [-] privsep: reply[28a3ed68-6a2f-43a3-b47a-76daa44a9557]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 879257, 'reachable_time': 41587, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256905, 'error': None, 'target': 'ovnmeta-a053bd92-7419-4340-b338-8c9eda334695', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.607 104206 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a053bd92-7419-4340-b338-8c9eda334695 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.607 104206 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa80000-5e32-4eb3-a413-c257a9753eef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 29 03:02:55 np0005539505 systemd[1]: run-netns-ovnmeta\x2da053bd92\x2d7419\x2d4340\x2db338\x2d8c9eda334695.mount: Deactivated successfully.
Nov 29 03:02:55 np0005539505 podman[256906]: 2025-11-29 08:02:55.657721521 +0000 UTC m=+0.053848783 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:02:55 np0005539505 podman[256903]: 2025-11-29 08:02:55.657692311 +0000 UTC m=+0.055981814 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.773 104094 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'c2:47:2b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '3e:b8:34:93:e6:1e'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 29 03:02:55 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:55.773 104094 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.818 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.847 186962 DEBUG nova.network.neutron [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.868 186962 INFO nova.compute.manager [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Took 1.75 seconds to deallocate network for instance.#033[00m
Nov 29 03:02:55 np0005539505 nova_compute[186958]: 2025-11-29 08:02:55.928 186962 DEBUG nova.compute.manager [req-2080c566-2489-4376-961c-330d0a6d7a49 req-88fe5483-bc97-47aa-9804-ba23dfa7d86a 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-deleted-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.098 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.098 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.135 186962 DEBUG nova.compute.provider_tree [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.148 186962 DEBUG nova.scheduler.client.report [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.170 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.214 186962 INFO nova.scheduler.client.report [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Deleted allocations for instance 058df251-6e51-4b9f-937f-08868563ee24#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.311 186962 DEBUG oslo_concurrency.lockutils [None req-09e8483a-2b54-41f8-8d42-d073a19c650c 28d5e849ea254d80ae2b0b1654f39d29 3c11b975eeef454fa22662a762affc9c - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.903 186962 DEBUG nova.compute.manager [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.904 186962 DEBUG oslo_concurrency.lockutils [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Acquiring lock "058df251-6e51-4b9f-937f-08868563ee24-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.904 186962 DEBUG oslo_concurrency.lockutils [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.904 186962 DEBUG oslo_concurrency.lockutils [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] Lock "058df251-6e51-4b9f-937f-08868563ee24-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.905 186962 DEBUG nova.compute.manager [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] No waiting events found dispatching network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 29 03:02:56 np0005539505 nova_compute[186958]: 2025-11-29 08:02:56.905 186962 WARNING nova.compute.manager [req-dde76487-9782-46c8-8dc1-7155411b4169 req-e4a5f1de-2caf-4d81-afe4-925f0923ddcb 0e7b7b888d1243ae80aea6e17ec26409 d1af104e13fd4ff9810b62031e33a7a3 - - default default] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Received unexpected event network-vif-plugged-d86104b7-e43a-474d-9e35-200858602d45 for instance with vm_state deleted and task_state None.
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.229 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.230 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.230 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.262 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.263 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.263 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.763 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:58 np0005539505 nova_compute[186958]: 2025-11-29 08:02:58.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:02:59 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:02:59.776 104094 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=cdd09ca7-026f-4d2a-8ff6-406e1fbb33b0, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 29 03:03:00 np0005539505 podman[256951]: 2025-11-29 08:03:00.714339965 +0000 UTC m=+0.050249570 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 29 03:03:01 np0005539505 nova_compute[186958]: 2025-11-29 08:03:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:01 np0005539505 nova_compute[186958]: 2025-11-29 08:03:01.443 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:01 np0005539505 nova_compute[186958]: 2025-11-29 08:03:01.631 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:03 np0005539505 nova_compute[186958]: 2025-11-29 08:03:03.766 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:03 np0005539505 nova_compute[186958]: 2025-11-29 08:03:03.934 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:04 np0005539505 nova_compute[186958]: 2025-11-29 08:03:04.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:05 np0005539505 nova_compute[186958]: 2025-11-29 08:03:05.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:08 np0005539505 nova_compute[186958]: 2025-11-29 08:03:08.706 186962 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764403373.704512, 058df251-6e51-4b9f-937f-08868563ee24 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 29 03:03:08 np0005539505 nova_compute[186958]: 2025-11-29 08:03:08.706 186962 INFO nova.compute.manager [-] [instance: 058df251-6e51-4b9f-937f-08868563ee24] VM Stopped (Lifecycle Event)
Nov 29 03:03:08 np0005539505 nova_compute[186958]: 2025-11-29 08:03:08.729 186962 DEBUG nova.compute.manager [None req-c3a67d69-c149-468f-99cd-5ce59d61608e - - - - - -] [instance: 058df251-6e51-4b9f-937f-08868563ee24] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 29 03:03:08 np0005539505 podman[256971]: 2025-11-29 08:03:08.74021472 +0000 UTC m=+0.074555289 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 03:03:08 np0005539505 podman[256972]: 2025-11-29 08:03:08.74798616 +0000 UTC m=+0.079010515 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 03:03:08 np0005539505 nova_compute[186958]: 2025-11-29 08:03:08.768 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:08 np0005539505 nova_compute[186958]: 2025-11-29 08:03:08.936 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:13 np0005539505 podman[257021]: 2025-11-29 08:03:13.723882371 +0000 UTC m=+0.051177238 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:03:13 np0005539505 podman[257022]: 2025-11-29 08:03:13.728967665 +0000 UTC m=+0.053252527 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 29 03:03:13 np0005539505 nova_compute[186958]: 2025-11-29 08:03:13.769 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:13 np0005539505 nova_compute[186958]: 2025-11-29 08:03:13.937 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:18 np0005539505 nova_compute[186958]: 2025-11-29 08:03:18.772 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:18 np0005539505 nova_compute[186958]: 2025-11-29 08:03:18.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:21 np0005539505 nova_compute[186958]: 2025-11-29 08:03:21.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:23 np0005539505 nova_compute[186958]: 2025-11-29 08:03:23.775 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:23 np0005539505 nova_compute[186958]: 2025-11-29 08:03:23.942 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:25 np0005539505 nova_compute[186958]: 2025-11-29 08:03:25.389 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:25 np0005539505 nova_compute[186958]: 2025-11-29 08:03:25.390 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 29 03:03:25 np0005539505 nova_compute[186958]: 2025-11-29 08:03:25.610 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 29 03:03:26 np0005539505 podman[257059]: 2025-11-29 08:03:26.716848269 +0000 UTC m=+0.046853546 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 03:03:26 np0005539505 podman[257058]: 2025-11-29 08:03:26.74519656 +0000 UTC m=+0.079354925 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 29 03:03:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:03:27.553 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:03:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:03:27.553 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:03:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:03:27.553 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:03:28 np0005539505 nova_compute[186958]: 2025-11-29 08:03:28.778 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:28 np0005539505 nova_compute[186958]: 2025-11-29 08:03:28.944 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:31 np0005539505 podman[257102]: 2025-11-29 08:03:31.720630519 +0000 UTC m=+0.054505233 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 29 03:03:33 np0005539505 nova_compute[186958]: 2025-11-29 08:03:33.780 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:33 np0005539505 nova_compute[186958]: 2025-11-29 08:03:33.945 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:35 np0005539505 nova_compute[186958]: 2025-11-29 08:03:35.193 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:38 np0005539505 ovn_controller[95143]: 2025-11-29T08:03:38Z|00872|memory_trim|INFO|Detected inactivity (last active 30029 ms ago): trimming memory
Nov 29 03:03:38 np0005539505 nova_compute[186958]: 2025-11-29 08:03:38.783 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:38 np0005539505 nova_compute[186958]: 2025-11-29 08:03:38.947 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:39 np0005539505 podman[257122]: 2025-11-29 08:03:39.710063772 +0000 UTC m=+0.047743561 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 29 03:03:39 np0005539505 podman[257123]: 2025-11-29 08:03:39.742404946 +0000 UTC m=+0.075111415 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 29 03:03:42 np0005539505 nova_compute[186958]: 2025-11-29 08:03:42.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:03:42 np0005539505 nova_compute[186958]: 2025-11-29 08:03:42.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 29 03:03:43 np0005539505 nova_compute[186958]: 2025-11-29 08:03:43.786 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:43 np0005539505 nova_compute[186958]: 2025-11-29 08:03:43.949 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:03:44 np0005539505 podman[257173]: 2025-11-29 08:03:44.732120609 +0000 UTC m=+0.063345962 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Nov 29 03:03:44 np0005539505 podman[257174]: 2025-11-29 08:03:44.732182771 +0000 UTC m=+0.060877312 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 29 03:03:48 np0005539505 nova_compute[186958]: 2025-11-29 08:03:48.789 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:48 np0005539505 nova_compute[186958]: 2025-11-29 08:03:48.951 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:49 np0005539505 nova_compute[186958]: 2025-11-29 08:03:49.409 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:49 np0005539505 nova_compute[186958]: 2025-11-29 08:03:49.410 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:03:52 np0005539505 nova_compute[186958]: 2025-11-29 08:03:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:53 np0005539505 nova_compute[186958]: 2025-11-29 08:03:53.792 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:53 np0005539505 nova_compute[186958]: 2025-11-29 08:03:53.951 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.519 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.519 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.519 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.520 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.675 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.676 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5715MB free_disk=73.07097625732422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.676 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:03:54 np0005539505 nova_compute[186958]: 2025-11-29 08:03:54.676 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:03:55 np0005539505 nova_compute[186958]: 2025-11-29 08:03:55.635 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:03:55 np0005539505 nova_compute[186958]: 2025-11-29 08:03:55.635 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:03:55 np0005539505 nova_compute[186958]: 2025-11-29 08:03:55.664 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:03:57 np0005539505 podman[257212]: 2025-11-29 08:03:57.711363117 +0000 UTC m=+0.046531557 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 29 03:03:57 np0005539505 podman[257211]: 2025-11-29 08:03:57.714403973 +0000 UTC m=+0.052644030 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter)
Nov 29 03:03:58 np0005539505 nova_compute[186958]: 2025-11-29 08:03:58.703 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:03:58 np0005539505 nova_compute[186958]: 2025-11-29 08:03:58.795 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:58 np0005539505 nova_compute[186958]: 2025-11-29 08:03:58.954 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:03:59 np0005539505 nova_compute[186958]: 2025-11-29 08:03:59.234 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:03:59 np0005539505 nova_compute[186958]: 2025-11-29 08:03:59.234 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:02 np0005539505 podman[257252]: 2025-11-29 08:04:02.711473533 +0000 UTC m=+0.044715975 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.235 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.235 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.235 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.682 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.682 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.682 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.682 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.797 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:03 np0005539505 nova_compute[186958]: 2025-11-29 08:04:03.956 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:05 np0005539505 nova_compute[186958]: 2025-11-29 08:04:05.820 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:07 np0005539505 nova_compute[186958]: 2025-11-29 08:04:07.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:08 np0005539505 nova_compute[186958]: 2025-11-29 08:04:08.800 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:08 np0005539505 nova_compute[186958]: 2025-11-29 08:04:08.958 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:10 np0005539505 podman[257271]: 2025-11-29 08:04:10.707684957 +0000 UTC m=+0.043832420 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:04:10 np0005539505 podman[257272]: 2025-11-29 08:04:10.753145323 +0000 UTC m=+0.082981218 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:04:13 np0005539505 nova_compute[186958]: 2025-11-29 08:04:13.842 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:13 np0005539505 nova_compute[186958]: 2025-11-29 08:04:13.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:15 np0005539505 podman[257323]: 2025-11-29 08:04:15.737363729 +0000 UTC m=+0.064833934 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:04:15 np0005539505 podman[257322]: 2025-11-29 08:04:15.749113531 +0000 UTC m=+0.085004174 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 03:04:18 np0005539505 nova_compute[186958]: 2025-11-29 08:04:18.845 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:18 np0005539505 nova_compute[186958]: 2025-11-29 08:04:18.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:23 np0005539505 nova_compute[186958]: 2025-11-29 08:04:23.848 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:23 np0005539505 nova_compute[186958]: 2025-11-29 08:04:23.963 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:04:27.554 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:04:27.554 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:04:27.554 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:28 np0005539505 podman[257364]: 2025-11-29 08:04:28.732102076 +0000 UTC m=+0.051777865 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:04:28 np0005539505 podman[257363]: 2025-11-29 08:04:28.754642283 +0000 UTC m=+0.081476754 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Nov 29 03:04:28 np0005539505 nova_compute[186958]: 2025-11-29 08:04:28.851 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:28 np0005539505 nova_compute[186958]: 2025-11-29 08:04:28.965 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:33 np0005539505 podman[257408]: 2025-11-29 08:04:33.75225656 +0000 UTC m=+0.073173540 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 03:04:33 np0005539505 nova_compute[186958]: 2025-11-29 08:04:33.855 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:33 np0005539505 nova_compute[186958]: 2025-11-29 08:04:33.967 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:38 np0005539505 nova_compute[186958]: 2025-11-29 08:04:38.858 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:38 np0005539505 nova_compute[186958]: 2025-11-29 08:04:38.971 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:41 np0005539505 podman[257427]: 2025-11-29 08:04:41.757760848 +0000 UTC m=+0.078556593 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:04:41 np0005539505 podman[257428]: 2025-11-29 08:04:41.770733994 +0000 UTC m=+0.093395552 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 29 03:04:43 np0005539505 nova_compute[186958]: 2025-11-29 08:04:43.862 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:43 np0005539505 nova_compute[186958]: 2025-11-29 08:04:43.973 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:46 np0005539505 podman[257476]: 2025-11-29 08:04:46.722312407 +0000 UTC m=+0.055573143 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 03:04:46 np0005539505 podman[257475]: 2025-11-29 08:04:46.742176899 +0000 UTC m=+0.073811778 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:04:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:04:48 np0005539505 nova_compute[186958]: 2025-11-29 08:04:48.899 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:48 np0005539505 nova_compute[186958]: 2025-11-29 08:04:48.974 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:49 np0005539505 nova_compute[186958]: 2025-11-29 08:04:49.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:49 np0005539505 nova_compute[186958]: 2025-11-29 08:04:49.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:04:53 np0005539505 nova_compute[186958]: 2025-11-29 08:04:53.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:53 np0005539505 nova_compute[186958]: 2025-11-29 08:04:53.902 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:53 np0005539505 nova_compute[186958]: 2025-11-29 08:04:53.975 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:54 np0005539505 nova_compute[186958]: 2025-11-29 08:04:54.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.404 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.404 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.563 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.564 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5717MB free_disk=73.07184982299805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:04:56 np0005539505 nova_compute[186958]: 2025-11-29 08:04:56.564 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.108 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.109 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.258 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.300 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.302 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:04:57 np0005539505 nova_compute[186958]: 2025-11-29 08:04:57.302 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:04:58 np0005539505 nova_compute[186958]: 2025-11-29 08:04:58.905 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:58 np0005539505 nova_compute[186958]: 2025-11-29 08:04:58.978 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:04:59 np0005539505 nova_compute[186958]: 2025-11-29 08:04:59.303 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:04:59 np0005539505 nova_compute[186958]: 2025-11-29 08:04:59.303 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:04:59 np0005539505 nova_compute[186958]: 2025-11-29 08:04:59.303 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:04:59 np0005539505 podman[257516]: 2025-11-29 08:04:59.722133028 +0000 UTC m=+0.054948035 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 29 03:04:59 np0005539505 podman[257517]: 2025-11-29 08:04:59.730628348 +0000 UTC m=+0.055009266 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:04:59 np0005539505 nova_compute[186958]: 2025-11-29 08:04:59.761 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:05:01 np0005539505 nova_compute[186958]: 2025-11-29 08:05:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:01 np0005539505 nova_compute[186958]: 2025-11-29 08:05:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:02 np0005539505 nova_compute[186958]: 2025-11-29 08:05:02.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:03 np0005539505 nova_compute[186958]: 2025-11-29 08:05:03.908 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:03 np0005539505 nova_compute[186958]: 2025-11-29 08:05:03.979 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:04 np0005539505 podman[257562]: 2025-11-29 08:05:04.709516375 +0000 UTC m=+0.043363978 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 29 03:05:07 np0005539505 nova_compute[186958]: 2025-11-29 08:05:07.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:08 np0005539505 nova_compute[186958]: 2025-11-29 08:05:08.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:05:08 np0005539505 nova_compute[186958]: 2025-11-29 08:05:08.910 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:08 np0005539505 nova_compute[186958]: 2025-11-29 08:05:08.981 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:12 np0005539505 podman[257581]: 2025-11-29 08:05:12.709633731 +0000 UTC m=+0.042658157 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:05:12 np0005539505 podman[257582]: 2025-11-29 08:05:12.750664931 +0000 UTC m=+0.080022303 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:05:13 np0005539505 nova_compute[186958]: 2025-11-29 08:05:13.912 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:13 np0005539505 nova_compute[186958]: 2025-11-29 08:05:13.983 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:17 np0005539505 podman[257630]: 2025-11-29 08:05:17.739190359 +0000 UTC m=+0.074493317 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd)
Nov 29 03:05:17 np0005539505 podman[257631]: 2025-11-29 08:05:17.752129015 +0000 UTC m=+0.085355444 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 29 03:05:18 np0005539505 nova_compute[186958]: 2025-11-29 08:05:18.915 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:18 np0005539505 nova_compute[186958]: 2025-11-29 08:05:18.984 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:23 np0005539505 nova_compute[186958]: 2025-11-29 08:05:23.918 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:23 np0005539505 nova_compute[186958]: 2025-11-29 08:05:23.986 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:05:27.555 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:05:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:05:27.555 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:05:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:05:27.556 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:05:28 np0005539505 nova_compute[186958]: 2025-11-29 08:05:28.921 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:28 np0005539505 nova_compute[186958]: 2025-11-29 08:05:28.988 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:30 np0005539505 podman[257671]: 2025-11-29 08:05:30.722491444 +0000 UTC m=+0.053479344 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, vcs-type=git, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41)
Nov 29 03:05:30 np0005539505 podman[257672]: 2025-11-29 08:05:30.728071481 +0000 UTC m=+0.054189363 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:05:33 np0005539505 nova_compute[186958]: 2025-11-29 08:05:33.922 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:33 np0005539505 nova_compute[186958]: 2025-11-29 08:05:33.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:35 np0005539505 podman[257715]: 2025-11-29 08:05:35.719514143 +0000 UTC m=+0.052603349 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:05:38 np0005539505 nova_compute[186958]: 2025-11-29 08:05:38.925 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:38 np0005539505 nova_compute[186958]: 2025-11-29 08:05:38.991 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:43 np0005539505 podman[257734]: 2025-11-29 08:05:43.716004796 +0000 UTC m=+0.051709553 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:05:43 np0005539505 podman[257735]: 2025-11-29 08:05:43.751253773 +0000 UTC m=+0.081850656 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:05:43 np0005539505 nova_compute[186958]: 2025-11-29 08:05:43.927 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:43 np0005539505 nova_compute[186958]: 2025-11-29 08:05:43.992 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:05:48 np0005539505 podman[257785]: 2025-11-29 08:05:48.717063349 +0000 UTC m=+0.054696018 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 29 03:05:48 np0005539505 podman[257786]: 2025-11-29 08:05:48.723273125 +0000 UTC m=+0.054676947 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 29 03:05:48 np0005539505 nova_compute[186958]: 2025-11-29 08:05:48.930 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:48 np0005539505 nova_compute[186958]: 2025-11-29 08:05:48.994 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:49 np0005539505 nova_compute[186958]: 2025-11-29 08:05:49.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:05:49 np0005539505 nova_compute[186958]: 2025-11-29 08:05:49.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 29 03:05:53 np0005539505 nova_compute[186958]: 2025-11-29 08:05:53.932 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:53 np0005539505 nova_compute[186958]: 2025-11-29 08:05:53.997 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:54 np0005539505 nova_compute[186958]: 2025-11-29 08:05:54.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.406 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.406 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.407 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.407 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.587 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.588 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5714MB free_disk=73.07184982299805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.588 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.589 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.645 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.645 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.778 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.799 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.801 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 29 03:05:57 np0005539505 nova_compute[186958]: 2025-11-29 08:05:57.801 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:05:58 np0005539505 nova_compute[186958]: 2025-11-29 08:05:58.935 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:58 np0005539505 nova_compute[186958]: 2025-11-29 08:05:58.999 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:05:59 np0005539505 nova_compute[186958]: 2025-11-29 08:05:59.801 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:05:59 np0005539505 nova_compute[186958]: 2025-11-29 08:05:59.802 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 29 03:05:59 np0005539505 nova_compute[186958]: 2025-11-29 08:05:59.802 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 29 03:05:59 np0005539505 nova_compute[186958]: 2025-11-29 08:05:59.827 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 29 03:06:01 np0005539505 podman[257824]: 2025-11-29 08:06:01.718595268 +0000 UTC m=+0.050217601 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 03:06:01 np0005539505 podman[257823]: 2025-11-29 08:06:01.719121503 +0000 UTC m=+0.052717892 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 29 03:06:03 np0005539505 nova_compute[186958]: 2025-11-29 08:06:03.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:06:03 np0005539505 nova_compute[186958]: 2025-11-29 08:06:03.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:06:03 np0005539505 nova_compute[186958]: 2025-11-29 08:06:03.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:06:03 np0005539505 nova_compute[186958]: 2025-11-29 08:06:03.939 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:04 np0005539505 nova_compute[186958]: 2025-11-29 08:06:04.000 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:06 np0005539505 podman[257869]: 2025-11-29 08:06:06.710912434 +0000 UTC m=+0.045206279 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 29 03:06:08 np0005539505 nova_compute[186958]: 2025-11-29 08:06:08.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:06:08 np0005539505 nova_compute[186958]: 2025-11-29 08:06:08.941 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:09 np0005539505 nova_compute[186958]: 2025-11-29 08:06:09.000 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:10 np0005539505 nova_compute[186958]: 2025-11-29 08:06:10.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 29 03:06:13 np0005539505 nova_compute[186958]: 2025-11-29 08:06:13.943 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:14 np0005539505 nova_compute[186958]: 2025-11-29 08:06:14.003 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:14 np0005539505 podman[257890]: 2025-11-29 08:06:14.718151791 +0000 UTC m=+0.052762163 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 29 03:06:14 np0005539505 podman[257891]: 2025-11-29 08:06:14.751179615 +0000 UTC m=+0.083191693 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 29 03:06:18 np0005539505 nova_compute[186958]: 2025-11-29 08:06:18.946 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:19 np0005539505 nova_compute[186958]: 2025-11-29 08:06:19.006 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:19 np0005539505 podman[257943]: 2025-11-29 08:06:19.718900215 +0000 UTC m=+0.052414972 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:06:19 np0005539505 podman[257942]: 2025-11-29 08:06:19.719464951 +0000 UTC m=+0.056058885 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 03:06:23 np0005539505 nova_compute[186958]: 2025-11-29 08:06:23.948 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:24 np0005539505 nova_compute[186958]: 2025-11-29 08:06:24.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:06:27.556 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 29 03:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:06:27.557 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 29 03:06:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:06:27.557 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 29 03:06:28 np0005539505 nova_compute[186958]: 2025-11-29 08:06:28.952 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:29 np0005539505 nova_compute[186958]: 2025-11-29 08:06:29.008 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:06:32 np0005539505 podman[257982]: 2025-11-29 08:06:32.732601519 +0000 UTC m=+0.061403507 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:06:32 np0005539505 podman[257981]: 2025-11-29 08:06:32.733075183 +0000 UTC m=+0.062947391 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., version=9.6)
Nov 29 03:06:33 np0005539505 nova_compute[186958]: 2025-11-29 08:06:33.957 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:34 np0005539505 nova_compute[186958]: 2025-11-29 08:06:34.010 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:37 np0005539505 podman[258025]: 2025-11-29 08:06:37.711852915 +0000 UTC m=+0.046138755 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:06:38 np0005539505 nova_compute[186958]: 2025-11-29 08:06:38.959 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:39 np0005539505 nova_compute[186958]: 2025-11-29 08:06:39.011 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:43 np0005539505 nova_compute[186958]: 2025-11-29 08:06:43.961 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:44 np0005539505 nova_compute[186958]: 2025-11-29 08:06:44.013 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:45 np0005539505 podman[258045]: 2025-11-29 08:06:45.758530708 +0000 UTC m=+0.093665810 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:06:45 np0005539505 podman[258046]: 2025-11-29 08:06:45.787364933 +0000 UTC m=+0.120853018 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 ceilometer_agent_compute[197706]: 2025-11-29 08:06:48.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 29 03:06:48 np0005539505 nova_compute[186958]: 2025-11-29 08:06:48.963 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:49 np0005539505 nova_compute[186958]: 2025-11-29 08:06:49.013 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:50 np0005539505 podman[258094]: 2025-11-29 08:06:50.720485555 +0000 UTC m=+0.053385491 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Nov 29 03:06:50 np0005539505 podman[258095]: 2025-11-29 08:06:50.720603108 +0000 UTC m=+0.050269972 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 03:06:51 np0005539505 nova_compute[186958]: 2025-11-29 08:06:51.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:51 np0005539505 nova_compute[186958]: 2025-11-29 08:06:51.378 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:06:53 np0005539505 nova_compute[186958]: 2025-11-29 08:06:53.966 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:54 np0005539505 nova_compute[186958]: 2025-11-29 08:06:54.016 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:55 np0005539505 nova_compute[186958]: 2025-11-29 08:06:55.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:58 np0005539505 nova_compute[186958]: 2025-11-29 08:06:58.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:58 np0005539505 nova_compute[186958]: 2025-11-29 08:06:58.969 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.017 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.410 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.411 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.411 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.412 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.609 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.610 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5733MB free_disk=73.07162475585938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.611 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.611 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.712 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.713 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.739 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing inventories for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.761 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating ProviderTree inventory for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.762 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Updating inventory in ProviderTree for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.790 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing aggregate associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.809 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Refreshing trait associations for resource provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0, traits: COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.838 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.855 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.857 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:06:59 np0005539505 nova_compute[186958]: 2025-11-29 08:06:59.857 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:07:01 np0005539505 nova_compute[186958]: 2025-11-29 08:07:01.857 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:01 np0005539505 nova_compute[186958]: 2025-11-29 08:07:01.858 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:07:01 np0005539505 nova_compute[186958]: 2025-11-29 08:07:01.858 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:07:01 np0005539505 nova_compute[186958]: 2025-11-29 08:07:01.914 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:07:03 np0005539505 nova_compute[186958]: 2025-11-29 08:07:03.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:03 np0005539505 podman[258130]: 2025-11-29 08:07:03.72044303 +0000 UTC m=+0.051460867 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 29 03:07:03 np0005539505 podman[258131]: 2025-11-29 08:07:03.742507714 +0000 UTC m=+0.069630210 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 29 03:07:03 np0005539505 nova_compute[186958]: 2025-11-29 08:07:03.972 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:04 np0005539505 nova_compute[186958]: 2025-11-29 08:07:04.019 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:04 np0005539505 nova_compute[186958]: 2025-11-29 08:07:04.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:05 np0005539505 nova_compute[186958]: 2025-11-29 08:07:05.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:08 np0005539505 podman[258173]: 2025-11-29 08:07:08.765559749 +0000 UTC m=+0.084446477 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 03:07:08 np0005539505 nova_compute[186958]: 2025-11-29 08:07:08.974 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:09 np0005539505 nova_compute[186958]: 2025-11-29 08:07:09.021 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:10 np0005539505 nova_compute[186958]: 2025-11-29 08:07:10.374 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:10 np0005539505 nova_compute[186958]: 2025-11-29 08:07:10.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:13 np0005539505 nova_compute[186958]: 2025-11-29 08:07:13.977 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:14 np0005539505 nova_compute[186958]: 2025-11-29 08:07:14.022 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:16 np0005539505 podman[258192]: 2025-11-29 08:07:16.724442528 +0000 UTC m=+0.056844448 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:07:16 np0005539505 podman[258193]: 2025-11-29 08:07:16.755252589 +0000 UTC m=+0.082747371 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 29 03:07:18 np0005539505 nova_compute[186958]: 2025-11-29 08:07:18.980 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:19 np0005539505 nova_compute[186958]: 2025-11-29 08:07:19.024 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:21 np0005539505 podman[258241]: 2025-11-29 08:07:21.727500818 +0000 UTC m=+0.063685893 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Nov 29 03:07:21 np0005539505 podman[258242]: 2025-11-29 08:07:21.736184503 +0000 UTC m=+0.066017228 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:07:23 np0005539505 nova_compute[186958]: 2025-11-29 08:07:23.981 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:24 np0005539505 nova_compute[186958]: 2025-11-29 08:07:24.026 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:07:27.557 104094 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:07:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:07:27.557 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:07:27 np0005539505 ovn_metadata_agent[104089]: 2025-11-29 08:07:27.557 104094 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:07:28 np0005539505 nova_compute[186958]: 2025-11-29 08:07:28.983 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:29 np0005539505 nova_compute[186958]: 2025-11-29 08:07:29.029 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:33 np0005539505 nova_compute[186958]: 2025-11-29 08:07:33.985 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:34 np0005539505 nova_compute[186958]: 2025-11-29 08:07:34.031 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:34 np0005539505 podman[258281]: 2025-11-29 08:07:34.712981483 +0000 UTC m=+0.043148671 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 29 03:07:34 np0005539505 podman[258280]: 2025-11-29 08:07:34.717993005 +0000 UTC m=+0.051682282 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter)
Nov 29 03:07:38 np0005539505 nova_compute[186958]: 2025-11-29 08:07:38.987 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:39 np0005539505 nova_compute[186958]: 2025-11-29 08:07:39.033 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:39 np0005539505 podman[258327]: 2025-11-29 08:07:39.743422816 +0000 UTC m=+0.080572969 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 29 03:07:43 np0005539505 nova_compute[186958]: 2025-11-29 08:07:43.990 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:44 np0005539505 nova_compute[186958]: 2025-11-29 08:07:44.036 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:47 np0005539505 podman[258346]: 2025-11-29 08:07:47.72859432 +0000 UTC m=+0.067116279 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 29 03:07:47 np0005539505 podman[258347]: 2025-11-29 08:07:47.743998265 +0000 UTC m=+0.078074948 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 29 03:07:48 np0005539505 nova_compute[186958]: 2025-11-29 08:07:48.991 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:49 np0005539505 nova_compute[186958]: 2025-11-29 08:07:49.038 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:52 np0005539505 nova_compute[186958]: 2025-11-29 08:07:52.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:52 np0005539505 nova_compute[186958]: 2025-11-29 08:07:52.379 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 29 03:07:52 np0005539505 podman[258396]: 2025-11-29 08:07:52.730724092 +0000 UTC m=+0.061634624 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 29 03:07:52 np0005539505 podman[258397]: 2025-11-29 08:07:52.738682067 +0000 UTC m=+0.069999471 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 29 03:07:53 np0005539505 nova_compute[186958]: 2025-11-29 08:07:53.993 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:54 np0005539505 nova_compute[186958]: 2025-11-29 08:07:54.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:57 np0005539505 nova_compute[186958]: 2025-11-29 08:07:57.379 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:07:58 np0005539505 nova_compute[186958]: 2025-11-29 08:07:58.996 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:07:59 np0005539505 nova_compute[186958]: 2025-11-29 08:07:59.041 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:01 np0005539505 nova_compute[186958]: 2025-11-29 08:08:01.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.626 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.627 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.627 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.627 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.786 186962 WARNING nova.virt.libvirt.driver [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.788 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5730MB free_disk=73.07160568237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.788 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.788 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.873 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.873 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.900 186962 DEBUG nova.compute.provider_tree [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed in ProviderTree for provider: 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.927 186962 DEBUG nova.scheduler.client.report [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Inventory has not changed for provider 2d55ea77-8118-4f48-9bb5-d62d10fd53c0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.929 186962 DEBUG nova.compute.resource_tracker [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.929 186962 DEBUG oslo_concurrency.lockutils [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 29 03:08:03 np0005539505 nova_compute[186958]: 2025-11-29 08:08:03.997 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:04 np0005539505 nova_compute[186958]: 2025-11-29 08:08:04.042 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:04 np0005539505 nova_compute[186958]: 2025-11-29 08:08:04.929 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:04 np0005539505 nova_compute[186958]: 2025-11-29 08:08:04.929 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 29 03:08:04 np0005539505 nova_compute[186958]: 2025-11-29 08:08:04.930 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 29 03:08:05 np0005539505 podman[258437]: 2025-11-29 08:08:05.713264795 +0000 UTC m=+0.046353832 container health_status 934886c0548f6dcefe42981d455f38625ff3249c58402ef832000215502642cf (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 29 03:08:05 np0005539505 podman[258436]: 2025-11-29 08:08:05.713385458 +0000 UTC m=+0.050180570 container health_status 88cc41bbf7d81ca893f6598f7f602e9fd54b33f1785cff799ad245019fae553e (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Nov 29 03:08:05 np0005539505 nova_compute[186958]: 2025-11-29 08:08:05.943 186962 DEBUG nova.compute.manager [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 29 03:08:05 np0005539505 nova_compute[186958]: 2025-11-29 08:08:05.943 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:05 np0005539505 nova_compute[186958]: 2025-11-29 08:08:05.944 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:06 np0005539505 nova_compute[186958]: 2025-11-29 08:08:06.378 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:09 np0005539505 nova_compute[186958]: 2025-11-29 08:08:09.001 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:09 np0005539505 nova_compute[186958]: 2025-11-29 08:08:09.044 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:10 np0005539505 podman[258481]: 2025-11-29 08:08:10.710446748 +0000 UTC m=+0.048427051 container health_status ac493eb7b48c8db9b8c1beadfe424764c5f74cb296fd8df87a82ea08b0369282 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 29 03:08:11 np0005539505 nova_compute[186958]: 2025-11-29 08:08:11.373 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:11 np0005539505 nova_compute[186958]: 2025-11-29 08:08:11.377 186962 DEBUG oslo_service.periodic_task [None req-f4b59027-dc43-48ad-a285-2328858b0617 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 29 03:08:13 np0005539505 systemd-logind[794]: New session 63 of user zuul.
Nov 29 03:08:13 np0005539505 systemd[1]: Started Session 63 of User zuul.
Nov 29 03:08:14 np0005539505 nova_compute[186958]: 2025-11-29 08:08:14.005 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:14 np0005539505 nova_compute[186958]: 2025-11-29 08:08:14.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:17 np0005539505 podman[258647]: 2025-11-29 08:08:17.980359576 +0000 UTC m=+0.078042538 container health_status d92bcc690321ef8f6c68e6ed2c59e3b016ba4a13c60c7f2c9b633e977da4865a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 29 03:08:17 np0005539505 podman[258648]: 2025-11-29 08:08:17.983658989 +0000 UTC m=+0.079689564 container health_status f828ce8b85ce1ce1663b962b540aa6fe865c1d41c37221218886aec7e787df43 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 03:08:18 np0005539505 ovs-vsctl[258719]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 29 03:08:19 np0005539505 nova_compute[186958]: 2025-11-29 08:08:19.007 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:19 np0005539505 nova_compute[186958]: 2025-11-29 08:08:19.046 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 29 03:08:19 np0005539505 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 258529 (sos)
Nov 29 03:08:19 np0005539505 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 29 03:08:19 np0005539505 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 29 03:08:19 np0005539505 virtqemud[186353]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 29 03:08:19 np0005539505 virtqemud[186353]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 29 03:08:19 np0005539505 virtqemud[186353]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 29 03:08:23 np0005539505 systemd[1]: Starting Hostname Service...
Nov 29 03:08:23 np0005539505 podman[259233]: 2025-11-29 08:08:23.160822272 +0000 UTC m=+0.065187004 container health_status d18825ddbfe692f198ea3ec55348036b5c9000b785b64ee9f2cc0916f41ee623 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 29 03:08:23 np0005539505 podman[259232]: 2025-11-29 08:08:23.163306102 +0000 UTC m=+0.068938300 container health_status 25dad8feb04cffa1e234a3319085cc966cca9f35942c016296b83955822e02b3 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 29 03:08:23 np0005539505 systemd[1]: Started Hostname Service.
Nov 29 03:08:24 np0005539505 nova_compute[186958]: 2025-11-29 08:08:24.009 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 29 03:08:24 np0005539505 nova_compute[186958]: 2025-11-29 08:08:24.047 186962 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 32 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
